Challenge: improve the speed of the #curl dotdot URL normalizer function. (without doing ridiculous things)

github.com/curl/curl/blob/28d2…

#curl
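For context, the normalizer in question does the RFC 3986 section 5.2.4 "remove dot segments" step: collapsing "." and ".." components out of a URL path. Below is a minimal baseline sketch of that algorithm in C. It is not curl's code (the function name is just taken from the RFC wording); it only illustrates the work involved, including the part that invites optimization: every "/../" forces a backwards scan over the output buffer to drop the previously written segment.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Baseline sketch of RFC 3986 5.2.4 "remove_dot_segments".
   Illustrative only, not curl's implementation. */
static char *remove_dot_segments(const char *input)
{
  size_t len = strlen(input);
  char *out = malloc(len + 1);   /* output never grows beyond the input */
  char *o = out;
  const char *p = input;

  if(!out)
    return NULL;

  while(*p) {
    if(!strncmp(p, "../", 3))       /* A: drop a leading "../" */
      p += 3;
    else if(!strncmp(p, "./", 2))   /* A: drop a leading "./" */
      p += 2;
    else if(!strncmp(p, "/./", 3))  /* B: "/./" becomes "/" */
      p += 2;
    else if(!strcmp(p, "/.")) {     /* B: trailing "/." becomes "/" */
      *o++ = '/';
      break;
    }
    else if(!strncmp(p, "/../", 4) || !strcmp(p, "/..")) {
      /* C: pop the last output segment, then continue from the '/' */
      while(o > out && *(o - 1) != '/')
        o--;                        /* scan back over the segment... */
      if(o > out)
        o--;                        /* ...and over the '/' before it */
      if(!strcmp(p, "/..")) {
        *o++ = '/';                 /* trailing "/.." leaves a lone '/' */
        break;
      }
      p += 3;                       /* "/../x" continues as "/x" */
    }
    else if(!strcmp(p, ".") || !strcmp(p, ".."))
      break;                        /* D: a lone "." or ".." is dropped */
    else {
      /* E: copy one segment: optional leading '/' plus up to the next '/' */
      if(*p == '/')
        *o++ = *p++;
      while(*p && *p != '/')
        *o++ = *p++;
    }
  }
  *o = 0;
  return out;
}

int main(void)
{
  char *n = remove_dot_segments("/a/b/c/./../../g");
  printf("%s\n", n);                /* expected: /a/g */
  free(n);
  return 0;
}
```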
in reply to Marcus Müller

@funkylab there's no existing benchmark, just this report that got me looking into it: hackerone.com/reports/3463608
in reply to daniel:// stenberg://

ah. OK. (and yes, if the problem is "people get to pass strings of unchecked complexity to libcurl", then that's an API consumer problem, not a problem of the lib implementing that API)

Simple solution here would *seem* (I bet the devil's in the details!) to be to have two arrays, instead of just one (see the sketch after this post):

1. a char array for the output
2. an integer array for "how long is this output path component"

Then, encountering `./` in the input just leaves both alone, and
1/2
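A quick sketch of that two-array idea, with made-up names (dedot_two_arrays, seg_len, nsegs) and under simplifying assumptions: the path starts with '/', and a trailing "/." or "/.." doesn't keep the final '/' that RFC 3986 would leave. The point is that "../" pops the previous component by subtracting its recorded length, with no backwards scan for a '/':

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical two-array variant: next to the output chars we keep a stack
   of per-segment lengths, so "../" drops the previous component with plain
   arithmetic instead of scanning backwards for a '/'. */
static char *dedot_two_arrays(const char *path)
{
  size_t inlen = strlen(path);
  char *out = malloc(inlen + 1);                          /* 1: char array for output */
  size_t *seg_len = malloc((inlen + 1) * sizeof(size_t)); /* 2: per-component lengths */
  size_t olen = 0;   /* bytes written to out */
  size_t nsegs = 0;  /* path components currently in out */
  const char *p = path;

  if(!out || !seg_len || *path != '/') {  /* sketch assumes an absolute path */
    free(out);
    free(seg_len);
    return NULL;
  }

  while(*p) {
    const char *seg = p + 1;        /* component text after the '/' */
    size_t slen = 0;
    while(seg[slen] && seg[slen] != '/')
      slen++;

    if(slen == 1 && seg[0] == '.') {
      /* "./" in the input: leave both arrays alone */
    }
    else if(slen == 2 && seg[0] == '.' && seg[1] == '.') {
      /* "../": pop the previous component using its recorded length */
      if(nsegs) {
        nsegs--;
        olen -= seg_len[nsegs] + 1; /* +1 for its leading '/' */
      }
    }
    else {
      /* ordinary component: append "/name" and push its length */
      out[olen++] = '/';
      memcpy(out + olen, seg, slen);
      olen += slen;
      seg_len[nsegs++] = slen;
    }
    p = seg + slen;                 /* next '/' or end of input */
  }
  out[olen] = 0;
  free(seg_len);
  return out;
}

int main(void)
{
  char *n = dedot_two_arrays("/a/b/c/./../../g");
  printf("%s\n", n);                /* expected: /a/g */
  free(n);
  return 0;
}
```

Since a path of length N can hold at most on the order of N components, both arrays can be sized from the input length up front, so no reallocation is needed while normalizing.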

in reply to M. Verdone

it's more of a check to figure out if we can improve. Triggered by this: hackerone.com/reports/3463608