Sasha Pachev
« on: March 26, 2009, 09:46:20 pm »
Course Tool now has elevation profile graphs. Still has some quirks, but usable. Will work more on it tomorrow. Report bugs on the Support forum.
Paul Petersen
« Reply #1 on: March 27, 2009, 06:04:21 am »
Looks good, great job! Does that mean that graphing is on the horizon for other blog features, such as mileage vs. time?
Mike Davis
« Reply #2 on: March 27, 2009, 01:16:41 pm »
Thanks, Sasha. This is an excellent feature!!
-Mike
Running without hills is like motorcycling without corners.
Sasha Pachev
« Reply #3 on: March 27, 2009, 04:01:54 pm »
Yes, Paul, I am planning on it. The big hang-up was finding a suitable graphics library. I finally decided to give PEAR Image_Graph a try. Let's hope it does not overwhelm the server.
I think you are going to like the next feature - mile gradients. The cool thing about Image_Graph is that I can now do just about anything you can do with your tools, except that it is programmatic and thus auto-magic.
Also, the graph revealed a weakness in my smoothing algorithm for the courses. They still do not look right even after smoothing, which is not surprising - the algorithm I have is rather crude. For each data point, if the elevation makes the grade from the last good point to the current one higher than the crazy cutoff, throw out that elevation and move to the next point. Keep moving until the grade from the last good elevation is below the crazy cutoff, then assume an even elevation change up to the current data point and replace the bogus elevations at the skipped points with those estimates.
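In code form, the current filter is roughly the following. This is a minimal sketch - the function name, point format, and parameter names are placeholders, not the actual libcourse.php code:

<?php
// Sketch of the crazy-grade filter described above.
// $points is an array of ['mile' => float, 'elev' => float] entries (elevation in feet);
// $maxGrade is the "crazy" cutoff as a fraction, e.g. 0.04 for 4%.
function fixCrazyGrades(array $points, $maxGrade)
{
    $lastGood = 0; // index of the last point whose elevation we trust
    for ($i = 1; $i < count($points); $i++) {
        $run  = ($points[$i]['mile'] - $points[$lastGood]['mile']) * 5280.0; // miles -> feet
        $rise = $points[$i]['elev'] - $points[$lastGood]['elev'];
        if ($run <= 0) {
            continue; // overlapping points - skip to avoid dividing by zero
        }
        if (abs($rise / $run) > $maxGrade) {
            continue; // grade is "crazy" - keep scanning for the next sane point
        }
        // Found a sane point: spread the elevation change evenly across the
        // points that were thrown out between $lastGood and $i.
        for ($j = $lastGood + 1; $j < $i; $j++) {
            $frac = ($points[$j]['mile'] - $points[$lastGood]['mile']) /
                    ($points[$i]['mile'] - $points[$lastGood]['mile']);
            $points[$j]['elev'] = $points[$lastGood]['elev'] + $frac * $rise;
        }
        $lastGood = $i;
    }
    return $points;
}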
So with that, take Ogden. If you do splits of 0.05 miles, you see that from 1.2 to 1.35 the course climbs at about a 2% grade, then from 1.35 to 1.4 it drops at a 13.84% grade, then from 1.4 to 1.75 it climbs again at about 1%, and then it drops at 3% for 0.05 and climbs at 3% for another 0.05. This is obviously wrong; the course does not do that. But what does it do?
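To put those grade numbers in perspective: grade is just rise over run, and a 0.05-mile split is 264 ft, so a 13.84% grade over one split works out to roughly 0.1384 * 264, or about 37 ft, of elevation change within that single split.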
Now smooth with a 4% crazy grade. The same stretch has a climb at about 2% from 1.2 to 1.35, then a drop at 3% to 1.55, then a climb at 1% to 1.75. More reasonable, but I think it is still wrong. Ogden never climbs at 2% for more than 0.05 in the first 5 miles. If it did, I would have noticed. It would have hurt.
Here is what I think the course actually does. Assuming the 5306 ft at 1.2 is correct, we probably have 5310 instead of 5312 at 1.25, 5313 instead of 5318 at 1.3, and 5315 instead of 5324 at 1.35; then we gradually drop to 5300 at 1.65 and start climbing a little bit. We never descend to 5287.
The big question is how to convert this intuition into an algorithm. Some ideas:
Use road construction guidelines. Are there any regulations that say how much a grade can change within a certain short segment when you build a road?
Paul - have you ever heard about ditch elimination algorithms in your GIS work?
Maybe try Bezier curves? (A rough sketch of that idea is below.)
Any other ideas?
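On the Bezier idea, the simplest version I can think of is to replace each interior elevation with the midpoint of the quadratic Bezier curve whose control points are the previous, current, and next elevations. A rough sketch with placeholder names - nothing here is in the current code:

<?php
function bezierSmoothElevations(array $elevations)
{
    // Quadratic Bezier midpoint with control points (prev, curr, next):
    // B(0.5) = 0.25*prev + 0.5*curr + 0.25*next.
    $n = count($elevations);
    if ($n < 3) {
        return $elevations; // nothing to smooth
    }
    $smoothed = $elevations;
    for ($i = 1; $i < $n - 1; $i++) {
        $smoothed[$i] = 0.25 * $elevations[$i - 1]
                      + 0.50 * $elevations[$i]
                      + 0.25 * $elevations[$i + 1];
    }
    return $smoothed;
}

Applying it more than once smooths more aggressively, at the cost of flattening real hills.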
James Winzenz
« Reply #4 on: March 27, 2009, 04:13:26 pm »
I noticed that today when I added a new course - it looks awesome! I did notice, however, several lines of errors when I attempted to edit a course using IE 6 (at work). I got this with the fix crazy grades option, and also at other times:
Warning: Division by zero in /home/asksasha/web/running-log/libcourse.php on line 269
Also got:
Warning: Cannot modify header information - headers already sent by (output started at /home/asksasha/web/running-log/libcourse.php:269) in /home/asksasha/web/running-log/trainlog2.php on line 1230
Warning: Cannot modify header information - headers already sent by (output started at /home/asksasha/web/running-log/libcourse.php:269) in /home/asksasha/web/running-log/trainlog2.php on line 1231
Not sure if this is related to IE or not - I can check with FF to see if I get the same errors.
Note - just checked with FF - got the same errors when trying to fix crazy grades. Does not appear to happen anywhere else.
« Last Edit: March 27, 2009, 04:17:06 pm by James Winzenz »
|
Sasha Pachev
« Reply #5 on: March 27, 2009, 04:35:37 pm »
Didn't they teach me in school that dividing by zero is a bad idea?
I've fixed the bug - I hope, since I did not have a test case. It could happen when two data points overlapped each other; I tried to make that happen in a test, but could not.
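Conceptually, the guard just needs to keep a zero-length segment from ever becoming a divisor; something along these lines (illustrative names only, not the actual libcourse.php code):

<?php
function segmentGrade($mileA, $elevA, $mileB, $elevB)
{
    $run = ($mileB - $mileA) * 5280.0; // miles to feet
    if ($run <= 0.0) {
        // Overlapping data points: treat the segment as flat instead of dividing by zero.
        return 0.0;
    }
    return ($elevB - $elevA) / $run;
}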
Paul Petersen
« Reply #6 on: March 27, 2009, 04:35:58 pm »
Good stuff.
Regarding smoothing, I usually smooth profiles simply by reducing the number of nodes. This drastically lowers the chance of something crazy sneaking in. By "nodes" I mean the number of data points. I usually generate tenth-mile postings for a race like a marathon or half marathon. So a marathon will have just over 260 elevation points. This is dense enough to show any hill of consequence, but sparse enough that most of the weirdness drops out. In addition, people want to see a nice, smooth profile, as it is more pleasing to the eye, and better reflects what they remember during the run. Tenth-mile or even quarter-mile intervals best accommodate all this.
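In code, that kind of node reduction is just a re-sample at a fixed spacing with linear interpolation in between. A minimal PHP sketch, assuming the course is stored as (mile, elevation) pairs - the names here are placeholders, not the site's actual code:

<?php
function resampleCourse(array $points, $spacing)
{
    // Walk the course at a fixed spacing (e.g. 0.1 mile) and linearly
    // interpolate the elevation at each new mile mark.
    // Assumes $points is sorted by mile and holds ['mile' => float, 'elev' => float].
    $n = count($points);
    if ($n < 2) {
        return $points;
    }
    $resampled = array();
    $j = 0;
    for ($mile = $points[0]['mile']; $mile <= $points[$n - 1]['mile']; $mile += $spacing) {
        // Advance to the segment that contains this mile mark.
        while ($j < $n - 2 && $points[$j + 1]['mile'] < $mile) {
            $j++;
        }
        $a = $points[$j];
        $b = $points[$j + 1];
        $run = $b['mile'] - $a['mile'];
        $frac = ($run > 0) ? ($mile - $a['mile']) / $run : 0.0;
        $resampled[] = array(
            'mile' => $mile,
            'elev' => $a['elev'] + $frac * ($b['elev'] - $a['elev']),
        );
    }
    return $resampled;
}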
Sometimes elevation model errors are unavoidable. For instance, the USGS DEMs are always bare-earth models. That means that human-made things such as bridges and buildings are removed from the model, so it represents the elevation of the ground surface. The problem is that we don't always run on the ground surface (i.e. when on a suspension bridge). Sometimes one of my profiles will take a dive that isn't really there. Sure enough, when I look at the map, the runner is going over a bridge at that point. The only way around this is for me to interpolate the true elevation via the manual method (type it in). Human editing, care, and customization - these are the things that keep me in business in this new world of internet auto-magic.
Sometimes the DEM is all kinds of messed up and I am forced to smooth it. For this I use a Focal Mean algorithm that averages each cell value based on the values of its neighbors within a roving, user-specified window (i.e. a 4x4 cell window). This works reasonably well, but it will tear down peaks and raise up valleys (to use biblical language). I have not tried Bezier curves.
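The one-dimensional analogue of that focal mean, applied to a profile's elevation list instead of DEM cells, would be something like this sketch, where the window size is the only parameter:

<?php
function focalMean(array $elevations, $halfWindow)
{
    // Each elevation becomes the average of its neighbors within
    // +/- $halfWindow points (the 1-D version of a roving window).
    $n = count($elevations);
    $smoothed = array();
    for ($i = 0; $i < $n; $i++) {
        $start = max(0, $i - $halfWindow);
        $end   = min($n - 1, $i + $halfWindow);
        $sum = 0.0;
        for ($j = $start; $j <= $end; $j++) {
            $sum += $elevations[$j];
        }
        $smoothed[] = $sum / ($end - $start + 1);
    }
    return $smoothed;
}

Like the 2-D version, it tears down peaks and raises valleys, so the window should stay small.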
Finally, there is quite a bit of error simply in the Google aerial photos. You get what you pay for sometimes. Remember that you are using completely undocumented photography from unknown sources and of unknown accuracy. The orthorectification of these aerials is suspect at best, and I see quite a few shifts and edge-match problems on a regular basis. What this means is that the USGS DEMs and the Google imagery will not line up perfectly at times, which will lead to poor results in areas of high relief and steep terrain (i.e. Ogden Canyon). So it's not necessarily DEM error (which certainly exists), but probably more aerial photography error.
So if I were you, I would reduce the node density. Just re-sample the spacing to 0.1 or 0.05 miles or something like that. Or give the user the option to set their spacing.
Adam R Wende
« Reply #7 on: March 28, 2009, 07:41:25 pm »
WOW! Paul, that sure gives me a new appreciation for what you do.
Paul (RivertonPaul)
« Reply #8 on: April 21, 2009, 04:19:30 pm »
Cool feature and discussion.