[langsec-discuss] Tainting input for better security
will.sargent at gmail.com
Fri Jun 15 04:48:37 UTC 2012
On 6/14/12 3:05 PM, Michael E. Locasto wrote:
> On 6/14/12 3:05 PM, Carter Schonwald wrote:
>>> What that means for me practically is that any input I have
>>> from the client, in the form of headers, parameters, cookies or
>>> form fields, should be automatically tainted by the system
>>> unless an appropriate validator is found for it.
>>> 1) Why don't all web frameworks do this out of the box?
> How deeply do you want to track/propagate this taint into the
> application logic? You'd need a part of the runtime system that
> does this implicitly for most data movement, copy, and
> transformation primitives. The performance impact may be
> prohibitive on a case-by-case basis.
Not really -- not if you did it all using a type system. At that
point you're not looking at a performance impact because an API that
doesn't take tainted input won't even compile.
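A minimal sketch of what type-level tainting could look like, in plain Scala with no particular framework (the `Tainted` and `Validated` names are hypothetical): raw client input only ever enters the program as `Tainted`, the sole way to obtain a `Validated` value is through a validator, and so any API that demands `Validated` rejects unchecked input at compile time rather than at runtime.

```scala
// Hypothetical sketch: wrap raw client input so the type system tracks taint.
final case class Tainted(value: String)

// A Validated value can only be constructed via the companion's validator.
final class Validated private (val value: String)

object Validated {
  def fromTainted(t: Tainted, isValid: String => Boolean): Option[Validated] =
    if (isValid(t.value)) Some(new Validated(t.value)) else None
}

// An API that refuses taint: passing a Tainted here is a type error,
// so the cost of enforcement is paid at compile time, not at runtime.
def render(input: Validated): String = s"<p>${input.value}</p>"

// render(Tainted("<script>"))  // does not compile: type mismatch
```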
There are a couple of different idioms for validation that I know of --
I think the Scalaz validation is closest to Haskell's: http://bit.ly/NCANqG
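For context, the idiom behind Scalaz's Validation (and its Haskell analogue) is an applicative that accumulates errors instead of failing at the first one. A rough, library-free approximation in plain Scala -- the helper names here are made up for illustration:

```scala
// Approximation of an error-accumulating validation, without Scalaz:
// a Validation is either a list of errors or a good value.
type Validation[A] = Either[List[String], A]

def nonEmpty(field: String, v: String): Validation[String] =
  if (v.nonEmpty) Right(v) else Left(List(s"$field is empty"))

def numeric(field: String, v: String): Validation[Int] =
  v.toIntOption.toRight(List(s"$field is not a number"))

// Combine two validations, keeping the errors from BOTH sides --
// this accumulation is what distinguishes Validation from plain Either.
def map2[A, B, C](va: Validation[A], vb: Validation[B])(f: (A, B) => C): Validation[C] =
  (va, vb) match {
    case (Right(a), Right(b)) => Right(f(a, b))
    case (Left(e1), Left(e2)) => Left(e1 ++ e2)
    case (Left(e), _)         => Left(e)
    case (_, Left(e))         => Left(e)
  }
```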
> I suppose you're arguing for validation at the point of input, but
> I suppose it is possible for two systems to be composed in such a
> way that input D passes through the validation for the first
> system, is somehow transformed by that system, and then is passed
> to the second system to consume (and exploit). Alternatively, the
> composed & transformed input could pass both validation phases, but
> still exercise a vulnerability that emerges from the composition of
> the two systems.
What I got from Meredith's talk was that validation should happen at
the interface of a system, and be enforceable at that boundary. More
broadly, the richer and more implicit the input to a system, the
harder a time you're going to have validating it, which is why
validating a UUID is easy and validating HTML is nearly impossible. If
you can decompose and transform input into the simplest bits possible,
it becomes much harder to break out and create a "weird machine."
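To make the UUID/HTML contrast concrete, here is a sketch of validating a canonical UUID in Scala: the grammar is small and regular, so the check is total. (Note that `java.util.UUID.fromString` on its own is lenient about component lengths -- it accepts strings like "1-1-1-1-1" -- hence the extra shape check.)

```scala
import java.util.UUID
import scala.util.Try

// A UUID has a tiny, regular grammar: the canonical 8-4-4-4-12 hex form.
val CanonicalUuid =
  "^[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}$".r

// Total check: require the canonical shape AND a successful parse.
def isValidUuid(s: String): Boolean =
  CanonicalUuid.matches(s) && Try(UUID.fromString(s)).isSuccess
```

There is no equivalently simple check for HTML: its grammar is large, extensible, and browser-dependent, which is exactly what makes it such fertile ground for weird machines.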
> I think internal processing has a tendency to diverge from the
> validation logic, which is another potential source of failure.
It always will -- the key is to separate, encapsulate and validate the
boundaries between systems, so that each boundary protects the system
behind it.
>>> 2) Why is validation in such a terrible state? It seems like
>>> people just throw regexps at the problem and hope for the best.
> Because a traditional CS undergraduate education introduces parsing
> in the context of recognizing computer languages rather than
> validating input arguments, function parameters, protocol messages,
> or file formats?
That sounds about right. I really wish that Parnas's talks on
encapsulation were required reading for all undergrads...