Attackpoint - performance and training tools for orienteering athletes

Discussion: Mapping

in: Orienteering; Training & Technique

Nov 25, 2013 2:32 AM # 
AddisonB:
Anybody have any suggestions on how I can start to map Marydale in Erlanger Kentucky? It's right across the street from St. Henry. Anything would be highly appreciated.
Nov 25, 2013 2:43 AM # 
tRicky:
Get an aerial photo in either JPEG or TIFF format and a copy of OCAD. See if you can georeference the base photo since this'll make it easier to GPS anything.
Nov 25, 2013 3:44 AM # 
jjcote:
Talk to eddie.
Nov 25, 2013 6:46 AM # 
Terje Mathisen:
Do you have any Lidar available in that area?

If so, then I'll recommend my article on basemap generation based on LAS/LAZ files.
Nov 25, 2013 1:15 PM # 
Spike:
Consider using Open Orienteering Mapper (as a free alternative to OCAD):

http://sourceforge.net/projects/oorienteering/

The National Map Viewer includes orthophotos of the area at 0.15 and 0.3 meter resolution. You can download the photos easily (and for free):

http://viewer.nationalmap.gov/viewer/

I'd also second the suggestions of talking to Eddie about lidar basemaps.
Nov 25, 2013 2:44 PM # 
cedarcreek:
The NKAPC has quoted $40 per tile (or "quarter-tile", I can't remember exactly) for lidar, which seems excessive. (In the past, they have given it to me because OCIN is a non-profit and it was a large, recognizable park.) The State of Kentucky has downloadable, lidar-derived DEMs (for certain areas, including Northern Kentucky) that are free, but you can't do the processing that makes lidar so good.
Nov 25, 2013 2:59 PM # 
cmorse:
If you have access to decent lidar data, Karttapullautin will churn out a pretty decent base, and the dxf contour information can be imported into OCAD or OO Mapper. But as others have already said, if you don't want to try to put all the pieces together yourself, Eddie can pull it all together for you and tweak the inputs for the best orienteering-relevant results.
Nov 25, 2013 5:03 PM # 
AddisonB:
Okay thanks everyone. I'll let you know how it goes.
Nov 26, 2013 10:22 PM # 
eddie:
Addison, I made a quick basemap for you. Files to download and some notes are here.

As Matthew pointed out, the KY bare earth lidar is available for free download as gridded DEMs (5ft/pix, which is about 1.5m/pix). I was curious, so I had a look. The quality is excellent, as the underlying lidar collection was done at better than 1m avg posting. You can probably get by with just the bare earth DEMs here and color orthos (left as an exercise for you to download :)

Check the notes on the above page for info regarding file format conversion. The DEMs are in ERDAS IMAGINE (.img) format. GDAL is awesome for handling these.
Nov 26, 2013 11:01 PM # 
Tundra/Desert:
Eddie, FYI you can contour in GDAL, too. But there are no knobs to twist to get the smoothing; one size fits all.
Nov 26, 2013 11:32 PM # 
bubo:
Unless thereĀ“s something wrong with my download procedure the Marydale base map is not oriented North-South and also mirrored?? (at least when comparing to Google Maps)

/ Curious George ;)
Nov 26, 2013 11:37 PM # 
eddie:
Nice. And it would work directly on the .img files. To get the indexing you could make multiple calls to gdal_contour.

You could smooth the .tif beforehand, but I can't see exactly where in the gdal tools you'd do that. You can do some smoothing in OCAD as well. I smooth in two stages: once on the gridded bare earth (boxcar) and once during the vector-to-bezier-curve conversion.
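The boxcar stage described above is easy to sketch. Here is a minimal moving-average filter on a gridded DEM, assuming numpy; the function name, kernel size, and test array are illustrative, not taken from any of the tools discussed:

```python
import numpy as np

def boxcar_smooth(dem, size=7):
    """Smooth a gridded DEM with a size x size boxcar (uniform) kernel.

    Edges are handled by padding with edge values, so the output has
    the same shape as the input.
    """
    pad = size // 2
    padded = np.pad(dem, pad, mode="edge")
    out = np.zeros_like(dem, dtype=float)
    # Sum the size*size shifted copies, then divide: a plain moving average.
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + dem.shape[0], dx:dx + dem.shape[1]]
    return out / (size * size)

# A single elevation spike gets spread over the kernel and attenuated,
# which is exactly what kills zigzag contour noise on flat ground.
dem = np.zeros((20, 20))
dem[10, 10] = 49.0
smoothed = boxcar_smooth(dem, size=7)
```

For a 1 m grid, `size=7` corresponds to the 7x7 m box mentioned later in the thread; on coarser grids the pixel count would shrink accordingly.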
Nov 26, 2013 11:42 PM # 
eddie:
Whoops! :) Bubo, you're right. I forgot to flip the .tif N-S before contouring. Hang on...
Nov 27, 2013 12:41 AM # 
eddie:
Ok, it's fixed. How embarrassing :) I had even checked the scale twice against Google Maps and didn't notice the flip. Bubo, please check the scale.
Nov 27, 2013 12:59 AM # 
vmeyer:
Curious George can be quite useful, eh?!
Nov 27, 2013 8:49 AM # 
Jagge:
Pullautin first searches for knolls, then it smooths the grid by steepness. This way steep parts stay sharp - no grid smoothing needed there - and flat parts can be smoothed more to get less of that zigzag contour noise, without actual knolls getting smoothed away. Contours are smoothed too, again by steepness - for the same reasons. It took some time to get it all somewhat balanced and there is still room for improvement. But it doesn't look bad now when it makes 5m + formlines http://maps.worldofo.com/webroute/?id=637

But for basemaps most of that may just make things worse. Pullautin is sort of trying to do the field checker's job there. If there will be field checking, the basemap should be made for field checking: better to do just light smoothing and let the mapper decide how, where and how much things get smoothed.
Nov 27, 2013 4:13 PM # 
eddie:
It's definitely a trade-off. The field checker can either smooth ziggy raw contours, or re-sharpen overly smoothed features. It boils down to S/N, signal being orienteering-significant features and noise being lidar error plus real surface roughness.

Smoothing is spatial filtering - in the case of contours, throwing away the highest frequency noise (and signal - like small terrain features) and contouring on the remaining lower frequencies. The unsharp-masked versions of the bare earth surface I make are the high frequency component of that same filtering. Contour on the data below a given spatial frequency cutoff (low-pass filter), return data above the cutoff as an image and use that as a template to view the high frequency features (high-pass filter, the unsharp mask). Jagge has done some great work with auto feature extraction in Pullautin.

You can smooth the unsharp mask image a bit to remove the very highest freq. components if you think they are noise, or if you're simply not interested in features smaller than say 0.5m.

Jagge's variable smoothing kernel is the proper way to ensure the filter has a similar spatial extent (spatial period, horizontally) everywhere across the image. A fixed boxcar kernel over terrain with varying steepness will have a variable frequency cutoff, smoothing over a larger land area where the ground is tilted than where it is flat.

But the choice of boxcar size is a gray area. Jagge, how much do you change the smoothing kernel between typical steep and flat areas? I've found that a 7x7m boxcar before contouring is about right on 1m gridded data, regardless of the original lidar post spacing (again, smoothing the resulting contours a little more in the next step). However I use a 5x5m box for making the unsharp mask. So a very small spread (difference) in the freq cutoffs used for the two components.
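The low-pass/high-pass decomposition described above can be sketched as follows; a plain mean filter stands in for whatever kernel is actually used, and the array, sizes, and names are all illustrative:

```python
import numpy as np

def lowpass(dem, size=5):
    """Mean-filter the DEM: keeps broad shapes, drops small features."""
    pad = size // 2
    padded = np.pad(dem, pad, mode="edge")
    out = np.zeros_like(dem, dtype=float)
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + dem.shape[0], dx:dx + dem.shape[1]]
    return out / (size * size)

# Decompose a surface into the part you contour (low frequencies) and
# the part you render as an unsharp-mask image (high frequencies).
rng = np.random.default_rng(0)
dem = np.add.outer(np.linspace(0, 10, 64), np.linspace(0, 5, 64))  # smooth slope
dem = dem + rng.normal(0, 0.2, dem.shape)  # small-scale surface roughness
low = lowpass(dem, size=5)                 # contour this (low-pass)
high = dem - low                           # view this as the unsharp mask (high-pass)
# By construction the two components sum back to the original surface.
```

The point of the decomposition is that nothing is thrown away: small features too noisy to contour reliably still survive in the high-frequency image, where the mapper can inspect them.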
Nov 27, 2013 6:20 PM # 
Bernard:
Related to this thread: the NY state orthoimagery is in JP2 format (JPEG 2000), which OCAD doesn't support for backgrounds. Can anyone point me to a reliable - free - conversion program (something that could do JP2 to TIFF, for example)?
Nov 27, 2013 6:40 PM # 
cmorse:
Imagemagick should do the trick - never used it on Windoze or Mac, but assume it will work the same.

http://www.imagemagick.org/script/index.php
Nov 27, 2013 6:40 PM # 
Tundra/Desert:
GDAL will do it just fine, preserving georeferencing if it's present. Older OCADs only understand uncompressed TIFF. So for a template openable in an older OCAD, it would go like

gdal_translate -co COMPRESS=NONE prettypicture.jp2 prettypicture.tif
Nov 27, 2013 6:46 PM # 
eddie:
Newer versions of Preview (on a mac) can read and write them. V 5.0.3 can for example.
Nov 27, 2013 7:20 PM # 
Tundra/Desert:
With georeferencing retained?
Nov 27, 2013 8:36 PM # 
Jagge:
If you use something like ImageMagick you will have to extract the georeferencing with some other tool, like exiftool. So better to use gdal.

eddie, I use a 2m grid (with our typical 0.7 points/m2 there is not much point in using a 1m grid) and a 5x5 box for smoothing (10x10m), twice as large as yours. The box isn't variable, it's fixed. Entirely flat parts get the 100% smoothed value, at a certain steepness or steeper a cell keeps the original value, and in between it gets a steepness-weighted average of those two values. The objective is just to smooth noise away from flat areas (usually marshes). Plenty of room for improvement there. Most of the smoothing is done on the contours, and a little sharpening too. Might be better to do more and smarter smoothing/sharpening on the grid and less on the contours.
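A rough sketch of the steepness-weighted blend Jagge describes (flat cells take the smoothed value, steep cells keep the original, with a linear mix in between); the slope computation, threshold, and names are my own illustrative choices, not Pullautin's actual code:

```python
import numpy as np

def slope(dem, cell=2.0):
    """Per-cell gradient magnitude via numpy's finite differences."""
    gy, gx = np.gradient(dem, cell)
    return np.hypot(gx, gy)

def steepness_weighted(dem, smoothed, max_slope=0.3, cell=2.0):
    """Blend smoothed and original DEMs by local steepness.

    w = 0 on flat ground (fully smoothed), w = 1 at max_slope or
    steeper (fully original), linear in between.
    """
    w = np.clip(slope(dem, cell) / max_slope, 0.0, 1.0)
    return w * dem + (1.0 - w) * smoothed

# Flat ground takes the smoothed value; a steep ramp is left untouched.
flat = np.zeros((10, 10))
ramp = np.add.outer(np.arange(10) * 2.0, np.zeros(10))  # slope 1.0 at cell=2
out_flat = steepness_weighted(flat, flat + 1.0)  # w = 0 -> smoothed everywhere
out_ramp = steepness_weighted(ramp, ramp + 1.0)  # w = 1 -> original everywhere
```

This is the property eddie calls the "variable smoothing kernel": the effective amount of smoothing adapts to terrain, instead of one fixed cutoff everywhere.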
Nov 27, 2013 9:06 PM # 
Jagge:
For eddie:
http://routegadget.net/misc/et.gif
I would not mind hearing reasons I am getting this or that wrong and how I might get it fixed. Just trying to learn from the master...
Nov 27, 2013 9:50 PM # 
igor_:
Is there a lidar file for this Earl's Trails quad? I'd like to run my processing on it to see what comes out.
Nov 27, 2013 9:55 PM # 
eddie:
The Amherst lidar data is online here. Earl's Trails is at bottom center.
Nov 29, 2013 3:33 AM # 
AddisonB:
This is awesome!!!!!! Thank you so much y'all. Have a wonderful thanksgiving!
Nov 29, 2013 6:07 AM # 
tRicky:
Okay.
Nov 29, 2013 8:25 PM # 
cedarcreek:
eddie---How did you create the unsharp mask image that looks like a slope image? This one.

All---One thing I don't understand is how to take tiled images and convert their projection without getting these black wedges on the sides. In another thread, (maybe this one) T/D (?) mentions using gdal_translate (and maybe gdalwarp) to stitch together a set of tiles, re-project them, and re-tile them into the new projection, but I also see mention of using regular stitching programs (such as for panoramas) to do the stitching part.

Just a quick story:

About a month ago, I worked all night getting a basemap for Vladimir Zherdev to use for Carter Caves, a state park in Kentucky. No lidar. We had two sources of contours. (1) A 1970 engineering map of the park at 1"=200 ft, and (2) DEMs (not very good) that followed the 7.5 minute USGS map series "tilings". I downloaded a "2-week free trial" of GlobalMapper, because so many people mention and recommend it. We also had about 45 downloaded aerials from Pictometry (orthorectified). We had previously figured out that the jpgs didn't download automatically with the jgw "world file", so David Waller had to go back into Pictometry and download those. (The aerials are recent, leaf-off, and quite good.)

The Pictometry aerials are georeferenced to "raw" (my term) WGS84 in longitude/latitude only, and it turns out that OCAD 11 doesn't recognize that projection---apparently it needs a UTM-type grid in meters. If you've got KY State Plane in feet, it will load, but not be right, and my normal kludge for converting feet to meters in OCAD 8 screws up the georeferencing.

So I went to my *2-week free trial* of GlobalMapper, and started converting the Pictometry images to UTM zone 17N (N for northern hemisphere) in meters. Tried a few, they loaded, but had black wedges. I figured that would be easy to figure out later because unlike tiled images, the Pictometry aerials overlap generously. So I try to convert all 45 aerials. Boom. I have exceeded the "2-week free trial" limit on conversions---dizziness---blood pressure spike---acceptance---thinking I might have to have the club drop $450 on GlobalMapper. Read a few attackpoint threads and decide to download QGIS, a *free, open-source* GIS program. I'm thinking, crap, this is going to suck. I download the 160MB file, install it. I've got five or six new icons on my (windows 7) desktop. Again, crap. I pick one that looks likely, "QGIS Desktop". It was the right one. I'm going to skip ahead. It wasn't painless, but it was nearly painless. An hour or two later, I've got 45 aerials projected in UTM 17N. (That was including the learning time---it's much faster if you know what you're doing.)

I realized the 7 sheets of the engineering map had a 1000 foot grid, with grid eastings and northings, and the tiles were marked with eastings and northings in feet in the Kentucky State Plane South. I had helped Mike wrangle the sheets on a copier, and I had 30 scanned sheets with enough grid crossings to use for georeferencing. I had seen a georeference tool in GM, and I started to use that---and I may be forgetful here---but I ran into the conversion limit before I georeferenced one map image. So back to QGIS. I'm going to again skip ahead and say that every time I thought "Crap, QGIS, 'free, open-source'", it ended up working. It wasn't easy, but dammit, it totally came through for me. If you guys who do command-line GDAL work can't easily find *all the state plane references* then get QGIS, because it is a front-end for GDAL. You just pick the projections and other difficult commands from dialog boxes, and QGIS actually creates (and displays) the gdalwarp or gdal_translate command it is going to use, *and* it lets you add typed commands if you need them. It's not amazingly user friendly, but again, it totally came through for me.

So I start using the georeferencing tool in QGIS. It appears to be working, so I spend about 2 hours doing them all. There was a click-box option to save the points for each image (a letter-sized (~A4) page (in jpg format) from the copier of the engineering map). So I load it in OCAD 11, and then add the aerials, and there is an error. It took me forever to figure this out. Well, say six hours. The engineering map images were loading over 100 miles away from the Pictometry aerials. The problem was that the 1970 map, that said it was in the KY S plane, wasn't. It was KY North. When I changed one image and tested it, I could immediately see that the UTM Eastings and Northings were right, but I can't tell you how cool it is to load it over the aerials, and just see that they're perfectly aligned. (Full disclosure---It did take some time to figure out the settings. Linear was not the right answer. It needed some heavier math to get the grid lines to align properly. Amateur-tip: If your map is black and white, don't scan it in black and white. Create jpgs in *color*. QGIS and the gdal utility act on the three color bands, and if it will work on a black-and-white scan, I couldn't figure out how.)

The engineering map only covered the park, but not the state forest, so I needed contours for that. Luckily, they're not as crazy as those inside the park, and I found four USGS-sized DEMs, created contours (using the QGIS-front-end to a GDAL utility), and they loaded basically perfectly over the Pictometry aerials.

Then we had problems printing from OCAD 11 to the copy shop's commercial laser printer because the copy shop computers didn't have enough RAM for the OCAD 11 free trial version. So I had to export PDF files, one page at a time, and transfer them to the copy shop computers on a thumbdrive. It took about four hours to print two basemaps, about 50 11x17 (~A3) sheets.

Oh---the black wedges. Because I had so much overlap in the aerials, I was able to use a QGIS tool to just crop out the black area and create a new TIF with a proper georeferencing. One problem I didn't figure out was how to control the output to other than TIF. It wasn't obvious how to get the output to be a jpg with world file rather than the geotiff. The problem with the geotiff was just that they took up so much space on the thumbdrive. I'm assuming they all load into memory with similar RAM requirements. (Another amateur-tip. My computer has 4GB of RAM. I could not have done this basemap project with 2GB without serious workarounds. 2GB should be fine for smaller projects.)

I wouldn't say QGIS is perfect. A lot of times I had to go through the dialog boxes 30 times for 30 conversions. A few times it would let me do one setup and then convert an entire directory. But here is what it did: It downloaded and installed without any hassle whatsoever. It had literally thousands of projections with a convenient search feature. It did the job, and it was free. Sometimes I had to repeat the same search over and over because a selection wasn't sticky, and sometimes you couldn't click on the projection to pick it, you literally had to type it. It's not terribly easy to use and it has quirks, but I'm very impressed by it anyway. I'm going to try using QGIS's easy list of state plane references when I use lastools.

GlobalMapper. What kind of "free, 2-week trial" has a limit of ten conversions? Seriously? I've heard good things. I don't want to be too obvious here, but one of the people I spoke to recommended GlobalMapper despite the fact that (this person) used to sell GM until the company that bought GM clawed-back the selling rights. "This person" even offered to convert files for me using a legal version of GM. (I'm using "this person" just to be non-attributional---they've been a great help). At that point, I'd already gotten through the learning curve for QGIS. I've got nothing against GlobalMapper or those of you who have purchased it and love it, it's just that I'm not feeling like the company that owns GlobalMapper really cares about my business. (10 conversions! You cannot be serious. I reached the limit less than 2 hours after I installed the *2-week free trial*, and literally after 15 minutes of use.)

Yes, this is a QGIS-love post. For those of you with OCAD 11 Standard, it will convert shp files to dxf or other formats importable by OCAD 11 Std. (Pro will import a shp file as long as it's not long-lat (I think).) If any of you contribute to open-source projects, QGIS might be worthy of your efforts.
Nov 29, 2013 8:41 PM # 
cedarcreek:
Also, for those of you who use Jagge's Karttapullautin program, QGIS is an easy way to view the tiled outputs. All you need to do is create a new project, then click on the button on the left that looks like a checkerboard (add raster layer), and point it to your jpg or png files, *not the world files*. It takes less than a minute.

Even the zoom buttons in QGIS are kludgy. The plus and minus and hand (for panning) work. I'm used to a "zoom-extent" or "view entire map" button, and that doesn't seem to work. It might be that 90% of what I've done is a raster layer.
Nov 29, 2013 9:07 PM # 
cedarcreek:
Some other mentions of QGIS on attackpoint: https://www.google.com/#q=site:attackpoint.org+qgi...
Nov 30, 2013 1:19 AM # 
Tundra/Desert:
Glad someone believes in georeferencing.
Dec 2, 2013 6:17 AM # 
cedarcreek:
Honestly, it's the path of least resistance. For a project this size, doing it without the georeferencing tools would be *a lot* more work. And I kinda know what's going on, but this was the first time I've done most of the things I mentioned in my long post.
Dec 2, 2013 7:01 AM # 
Tundra/Desert:
Some believe georeferencing is only for those who can't help themselves.
Dec 2, 2013 9:55 AM # 
Terje Mathisen:
Georeferencing is the only proper way; it is _so_ much nicer when everything, including GPX files from hikes & survey trips, matches up. :-)

I have made another little breakthrough in my quest for the best possible base maps:

I start by processing the non-ground lidar points, classifying the height distribution of returns around a given 2x2m pixel area. I compare this distribution with a number of benchmark points, picking the closest match as the most likely candidate.

(This can result in white, yellow, green stripes, yellow+green stripes, or light/normal/dark green.)

Next is the lowpass filter where I pick the most common classification for a weighted circle around the current point. For this process I first consider all the green types as a common type, then pick the best subtype.

The new piece of the puzzle has been converting these GIF images to vector data, filtering the jagged edges and converting each patch of vegetation into a separate OCAD area object, possibly with one or more embedded holes.

I import these calculated vegetation objects as duplicates of the normal OCAD objects, i.e. 410.001 instead of 410 etc, so that it is easy to know the difference between the base map and what the surveyor has verified.
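A stripped-down sketch of the classification step Terje describes: histogram the non-ground return heights in each cell, then pick the nearest benchmark distribution. The bin edges, benchmark values, and distance metric below are invented for illustration, not Terje's actual parameters:

```python
import numpy as np

# Height bins (metres) for the return-height distribution in each cell.
BINS = np.array([0.0, 0.5, 2.0, 5.0, 40.0])

# Benchmark distributions, e.g. measured at surveyed reference points.
BENCHMARKS = {
    "open":         np.array([0.95, 0.03, 0.01, 0.01]),
    "light green":  np.array([0.40, 0.35, 0.15, 0.10]),
    "dark green":   np.array([0.10, 0.40, 0.35, 0.15]),
    "white forest": np.array([0.30, 0.05, 0.15, 0.50]),
}

def classify_cell(heights):
    """Classify one cell from the heights of its non-ground returns."""
    hist, _ = np.histogram(heights, bins=BINS)
    if hist.sum() == 0:
        return "open"  # no vegetation returns at all
    dist = hist / hist.sum()
    # Nearest benchmark by Euclidean distance between the distributions.
    return min(BENCHMARKS, key=lambda k: np.linalg.norm(dist - BENCHMARKS[k]))

# A cell dominated by canopy-height returns with little understory
# looks like runnable (white) forest.
canopy = classify_cell(np.array([18.0, 20.0, 22.0, 0.1, 19.5]))
```

Tuning here means exactly what Terje describes: visit a few spots in the terrain, record their coordinates and true vegetation class, and replace the benchmark vectors with the measured distributions.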
Dec 2, 2013 12:16 PM # 
jjcote:
Don't misinterpret what Swampfox said. He didn't say that georeferencing isn't important. He said that lack of georeferencing is not the biggest problem with a lot of modern maps. Georeferenced crap is still crap. A non-georeferenced quality map is still a quality map, same as it ever was. A georeferenced quality map is best, and he said in his first line, "it's good to change with the times".
Dec 2, 2013 2:46 PM # 
Tundra/Desert:
A non-georeferenced quality map would have been georeferenced by someone with a touch of ability years ago. The reason a map hasn't been is most likely because the distortions in it prevent georeferencing to any degree of certainty; you pick two points, calculate your reference, it's only good in a limited area, everything else is off. You can repeat the two-point procedure all you want, same result. When you feed this product to runners, some will be able to cope better and some worse, but you still aren't testing them fairly if a 250 m route left of the hill is 220 m in reality and a 250 m right of the hill is 280 m.

Mappers who believe in their eyes more than in hard data are likely to generate distortions. And, with all respect, a distorted map fits the definition of crap.
Dec 2, 2013 3:59 PM # 
jjcote:
Distorted maps are crap, I don't think anyone disputes that, especially to the degree that you're describing.
Dec 2, 2013 4:23 PM # 
Tundra/Desert:
I am fighting the F9, J-J. And the mentality that comes with it, which goes like, "if things look OK to me locally, it's fine to F9 the crap out of this perfectly orthorectified base to fit my imagination".
Dec 2, 2013 4:40 PM # 
Terje Mathisen:
It is OK to take an old map with skewed base material and then use rubber banding to force it to match up with a brand new set of laser contours and buildings/roads vector data, so that you can use it as a base map for the new map you really need to produce.

One of the areas I have worked on recently had an old but pretty good map that had been mapped without properly orthocorrecting the aerial images, so that the higher the elevation, the larger the error.
Dec 2, 2013 6:10 PM # 
jjcote:
I'll agree that, unless you know that the base is junk, you should not distort it. In the early days of OCAD, all you could do to a template was to rotate or scale it (isotropically), and I think that was appropriate.
Dec 2, 2013 6:17 PM # 
Tundra/Desert:
No, my claim goes further. I say that neither translating the origin, rotating, nor scaling is appropriate for a georeferenced base. If the sole reason for F9'ing is because the orthorectification was not done properly, find a better photo. I don't see errors greater than 2 m for Google vs. Yahoo! vs. Bing vs. USGS vs. the truth recently, and 2 m should be good enough even for a Sprint map. But some mappers will still F9.
Dec 2, 2013 6:24 PM # 
jjcote:
Yes, but the basemaps back then were not georeferenced. They arrived on mylar in a tube, and there was no ground truth to speak of. Rotating the scans of the various pieces so that they matched up (because they didn't go on the scanner with exactly the same alignment) was what the template alignment was for. If you have georeferencing, then yeah, you obviously don't want to throw out that information. Rubbersheet rarely makes any sense, except in the oddball case that Terje mentions (in which you're making an old twisted map line up with known good data).
Dec 2, 2013 6:36 PM # 
Tundra/Desert:
Right, and I'm not talking about back-then; I'm talking about now. There are still people who F9 georeferenced photos.
Dec 2, 2013 9:47 PM # 
jjcote:
I am in agreement, that's entirely the wrong way to do things.
Dec 2, 2013 10:56 PM # 
gruver:
At least as big an issue is those who don't see the need to fix their "old faithfuls" at all, I reckon.
Dec 2, 2013 11:02 PM # 
TheInvisibleLog:
QGIS +1
Dec 2, 2013 11:56 PM # 
Pink Socks:
Ok, I'd like to tackle some small personal mapping projects (one at ISSOM 1:2000, 1m, one at ISSOM 1:5000, 2.5m), and I figure I might as well learn how to do this stuff, and it seems like the smart people are already in this thread.

Here are the drawing programs that I have:
--OCAD9
--Open Orienteering Mapper

The areas that I have are covered by LIDAR (Puget Sound LIDAR Consortium), and the following are available:
--Bare Earth - 3-foot raster resolution
--Top Surface - 3-foot raster resolution
--Bare Earth Point ASCII files
--All-Returns ASCII files
--LAS files

I'd like to start by just generating the contours, and it seems like there are a few options here? OL-Laser, Karttapullautin, GDAL, lastools, QGIS, more?

Which of these do I need to get?

Vegetation data would be nice to have, but considering how small the areas are, it's not a huge deal if I can't get that to work out.

And then there's the georeferencing bit. I've seen reference to state planes, DEM grid size, etc, so I want to make sure I get this right. I've also seen references to orthophotos, too. The areas I'm looking at have much better resolution on Google and Bing then they do on the National Map. How much difference is there between orthophotos and Google/Bing?

Thanks!
Dec 3, 2013 3:47 AM # 
cedarcreek:
Probably the easiest way to get contours would be to get the DEM (which is what I think your "Bare Earth - 3-foot raster resolution" is.) Get QGIS. Create new project and set it to the projection (CRS) of your DEM. Load raster layer and point to the DEM file(s). There is a setting for enabling "on the fly" projection changes. Change it to your UTM N zone (T/D says California is 10N. There are several UTM 10N choices, but I've been picking those with WGS 84. Obviously, southern hemisphere people will use a UTM S zone.) You could choose another projection. The default UTM uses eastings and northings in meters, so that will be changed automatically. So now your Es and Ns are in m, but the elevation is still feet. Use the "Raster Calculator" to multiply the elevation values by 0.3048 (exact feet to meters conversion factor). Google "QGIS raster calculator" if necessary. Then find the contour tool: Raster, Extraction, Contour. You will specify a contour interval and file name, but also get it to load into the project when it is done. The contours are shp files. Right click on the contour layer(s) and export using "Save As" and select dxf (autocad format). Cross your fingers and hope they will import into OCAD 9. You can clean up the layer list by right clicking and "removing" layers as you don't need them anymore.

I'm not 100% comfortable with the "allow on-the-fly projection changes" because sometimes you need to create files that are changed, but sometimes you don't care---you just need the final product, not the intermediate steps. You can use the Raster-Projection-Warp selection to batch process a directory of DEMs into UTM. I haven't tried doing the elevation conversion as a batch.

I recommend Karttapullautin as a first lidar tool---it's probably the best-looking output with the least amount of work. You'll need lastools to use it with las files (I think this is still correct). OL-Laser is also easy to use, but both have a high hurdle to get started. They're a bit intimidating. The first big problem is that you really need the las files to be in meters for both, and that's command line las2las.exe stuff. I processed a few .las files in OCAD 11---Mostly making contours---I can't remember. Still, the issue was first having to convert the units to meters.
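The unit-conversion step above (the raster-calculator multiply by 0.3048) is just a scalar multiply on the elevation values; a minimal sketch, with the DEM array and contour interval made up for illustration:

```python
import numpy as np

FT_TO_M = 0.3048  # international feet to metres (exact by definition)

def dem_feet_to_meters(dem_ft):
    """Scale a DEM's elevation values from feet to metres.

    Note this only fixes the z values; easting/northing units come
    from reprojecting (e.g. to UTM), not from this step.
    """
    return np.asarray(dem_ft, dtype=float) * FT_TO_M

def contour_levels(dem_m, interval=5.0):
    """Contour levels (multiples of the interval) spanning the DEM's range."""
    lo = np.ceil(dem_m.min() / interval) * interval
    return np.arange(lo, dem_m.max() + 1e-9, interval)

dem_ft = np.array([[328.0, 512.0], [700.0, 1050.0]])
dem_m = dem_feet_to_meters(dem_ft)            # roughly 100 m to 320 m
levels = contour_levels(dem_m, interval=5.0)  # 100, 105, ..., 320
```

Doing the multiply before contouring is the important part: if you contour the raw feet values at a "5" interval you get 5-foot contours, not 5-metre ones.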
Dec 3, 2013 4:10 AM # 
cedarcreek:
I just sent Pink_Socks a dxf I made as I typed the post above, and Gmail actually created a thumbnail preview of the dxf file showing part of the contours. So Gmail can read it---let's see if OCAD 9 will.
Dec 3, 2013 9:31 AM # 
Terje Mathisen:
I would really like to see more people tackle that command line/lastools hurdle and then go directly to the raw LAS/LAZ files!

My batch pipeline starts with those files (as you note, in 1m UTM coordinates) and then it can construct contours with depressions and dot knolls, slope and cliff images, a DEM (for OCAD 10+), as well as a tuneable vegetation classification.

The vegetation data can be finetuned after a couple of surveying trips by entering benchmark coordinates for "typical" green/light green/dark green/white/green strips etc and then redoing the vegetation stage.

I can also convert those vegetation images into proper OCAD vector area objects, making them far easier to edit.

However, as you note the initial hurdle is to get comfortable with lastools and the command line...
Dec 3, 2013 5:09 PM # 
Pink Socks:
Thanks for the info and email, Matthew. Terje, I read through your article and might tackle that process later. I have to work in baby steps here. I'm spending 5 days over New Year's within 1km of these maps, so at the latest I'll be doing this process then (I've got a lot of other stuff to work on before then).
Dec 3, 2013 10:53 PM # 
cedarcreek:
Terje and all---I just want to explain the situation as I see it. I'm not sure I've got it right. The path of least resistance for processing lidar data (las/laz/xyz) that is in feet is to simply convert the values in feet (or survey feet, apparently they're different) to meters and to ignore the georeferencing. In old versions of OL-Laser, I could even do the processing in feet with a grid setting of 3. OL-Laser thought it meant 3m, but the data was in feet, so it was actually a bit less than 1m. I did vegetation grids in 6 or 9 or 10 feet settings. But again, the new version really prefers meters, and doing a simplistic feet-to-meters conversion loses all the georeferencing data. It might be worth it for a single tile, but beyond that it's just a pain.

The "command line problem" is that if you simply convert the projection of a las tile, the data is no longer orthogonal to the x- and y-axes---it's skewed. From what I can tell, the lidar processing apps don't like that. What has to be done is to merge all the data together (lasmerge), convert to the new projection (las2las), clip it if necessary (lasclip or lastile with a -clip command), then (for large areas) retile the data into las/laz/xyz tiles (lastile) that have sides orthogonal to the new projection's axes (and possibly picking a tiling scheme (offset?) that eliminates long, skinny rectangles). And you have to decide whether to use the buffer command that lets the new tiles have a thin strip of data around them to allow better matching of the edges of the output. Sometimes the data is uncharacterized, or the characterization done by the government was intended to minimize hassle to them rather than to make the best orienteering map possible. Sometimes the data is filtered "bare earth only", sometimes it's two data sets, one for first returns and one for bare earth, and sometimes it's just single las files that work without any hassle at all.

I just checked a book out of the library, "Basic GIS Coordinates, Second Edition," by Jan Van Sickle, CRC Press, 2010. I haven't read this book. Here are a few quotes. Some of these make my head hurt, like NAD83 changing and WGS84 not:

"As refinements are made to NAD83, the new adjustments are added as a suffix to the SPCS83 [State Plane Coordinate System 1983] label. For example, SPCS83/99 would refer to state plane coordinates that were based on a revision to NAD83 from 1999."

"The conversion from meters to US Survey feet is correctly accomplished by multiplying the measurement in meters by the fraction 3937/1200." (This is not a quote: So---US Survey Feet are the original foot, before 1 inch was defined as 25.4mm? And the new foot is the "international foot"? From m to feet, the international conversion is 0.3048, and the 1200/3927 "US Survey Foot" factor is 0.30480061, so this might not matter much for elevation, but it might be off something like 8 inches (20cm) in eastings and 3-4 feet (~1m) in northings...)

"...[T]he official native unit of SPCS83 coordinates is the meter. However, reporting in feet is often required. Many states prefer US Survey feet: Nebraska, Wyoming, Colorado, California, Connecticut, Indiana, Maryland, North Carolina, and Texas. Other states prefer international feet: Arizona, Michigan, Montana, Oregon, South Carolina, and Utah. Still others have taken no official action on the issue. Nevertheless, clients in any state may request coordinates in either format."

"Another common problem stems from the periodic readjustments performed by NGS. As mentioned in Chapter 2, NAD83 has been subject to refinements since it replaced NAD27. These improvements are largely due to the increasing amount of GPS information available and are denoted with a suffix, such as NAD83/91, the latter number referring to the year of the readjustment. Since SPSC83 is based on NAD83, these readjustments result in new state plane coordinates as well. It is therefore feasible that one county may use NAD83/86 coordinates and an adjoining county may use NAD83/94 coordinates."

"Here is a convenient way to find the zone number... Consider west longitude negative and east longitude positive, add 180 degrees, and divide by six. Any answer greater than an integer is rounded to the next highest integer, and you have the zone. For example, Denver, Colorado, is near 105 deg W longitude, i.e. -105 deg:
-105 + 180 = 75 deg
75 / 6 = 12.50
Round up to 13
Therefore, Denver is in UTM zone 13.
All UTM zones have a width of 6 deg of longitude..."
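The book's round-up rule is equivalent to a floor-plus-one in code (except exactly on a zone boundary, where the usual convention puts the point in the higher-numbered zone). A minimal Python sketch:

```python
import math

def utm_zone(lon_deg):
    """UTM zone for a longitude in degrees (west negative, east positive).
    Zones are 6 degrees wide; zone 1 starts at 180 W."""
    return int(math.floor((lon_deg + 180) / 6)) % 60 + 1

print(utm_zone(-105))  # Denver, near 105 deg W -> prints 13
```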
Dec 3, 2013 11:54 PM # 
Juffy:
...and this is why coordinate systems are the bane of GIS people everywhere. (even before you add the ridiculousness of 5 different lengths for "foot") :)

It's even more hilarious when your continent is wandering North at a few cm a year, so every 20 years they go "whoops, all your projections are invalid!"
Dec 4, 2013 12:16 AM # 
TheInvisibleLog:
Lobby your politicians to go metric and also stop the drift north. The latter might be the easier option. Australia has its own geo-engineering campaign. Might be useful.
http://www.abc.net.au/cnnnn/profiteering/tiltaustr...
Dec 4, 2013 1:05 AM # 
igor_:
For UTM zones just switch coordinate system in Google Earth to UTM and you are all set.
Dec 4, 2013 1:25 AM # 
carlch:
I don't really understand the lidar processing but someday I hope to. So, thanks Cedarcreek and Terje for taking the time to write these explanations. I'm hoping that eventually it will start to make sense.
Dec 4, 2013 3:10 AM # 
hughmac4:
Hey cedarcreek: I've done a couple of biggish maps (6 and ... I think 15 PA las tiles, plus OSM and Philly Streets Department data) with lastools, Kartapullautin, and OCAD, and didn't have to merge anything. I just processed each of the tiles with lastools, then Kartapullautin, then pulled it all into OCAD (except for the PNG vegetation background image, because I could do that with a single command).

Which 'las processing apps' don't like skewed tiles? It's just points in a cloud ... sure, the edges are skewed, but all the data is there. You just chop out the funny triangles, and if you need the data in the 'missing' areas you process the next tile(s) over, drop them in OCAD, and 'Partial Map' it or something.

Or am I (probably) missing the whole point of your paragraph? :)

I WOULD (and generally do) normally lasmerge and clip (because invariably you only need a 500m strip from one tile, AND it keeps all the contours connected), but the resulting sets would have been WAY too huge for Kartapullautin to handle (32-bit RAM issue? Dunno if that's been worked around yet).

I love the sound of GIF > OCAD objects for the veg, Terje! I'll have to do some comparison shopping, now that there's more than one option. Thanks for the hard work!

6-tile Wissahickon: what great data from Philly!
Dec 4, 2013 7:20 AM # 
Jagge:
Pullautin handles long, skinny rectangles just fine, so one can use the default tiling scheme without any offsets (in batch mode pullautin takes a 100m overlap from the neighbouring tiles and crops it off when the work is done). So for pullautin you will just have to convert to meters/the new projection and re-tile. I don't think you need to merge; you can use *.laz as input. No point processing skewed files and hand-editing those triangles.

hughmac4, lots of pullautin users run out of memory long before hitting the 32-bit RAM limit, and many run 32-bit systems anyway, so it's not much of a hurdle. Just re-tile; it really isn't that hard, just one simple command.

---

When it comes to geo-referencing: before background images and scanners (ocad4?) we used a digitizing board, and basemaps arrived on mylar in a tube. The only truth was that mylar, so we had a mm grid on the mylar and the same mm grid in ocad, and we calibrated each piece using that grid.

Then came background images and scanners, but we basically did the same. There always had to be one base reference grid, and we used F9 to calibrate all background images to that reference grid (not to fit the neighbouring piece of map). None of that was geo-referencing; it was just referencing, and having a good solid reference. It all worked just fine, and I'd say that's mostly fine even today. If needed, one should always be able to easily geo-reference the end product with just two/three points - if the original mylar grid used was fine.

Geo-referencing makes it easier to use georeferenced base data. But it doesn't make the map any better or, for example, make it any easier to overlay a GPS track on a scanned version of the map. Even without geo-referencing you should be able to use the same GPS calibration (like 3 reference points) for the whole map while mapping with a tablet. If old (or new) maps are distorted, the reason is not lack of geo-referencing, but not having referencing at all or using a distorted reference.
Dec 4, 2013 11:39 AM # 
Terje Mathisen:
Georeferencing map data is sort of like using Network Time Protocol:

What we really need is for any pair of computers we pick for a given application/process to agree on what the time is.

It turned out very early that by far the easiest way to do this is by making sure that all systems can trace their time base to UTC via NTP packets, right?

Similarly, having properly georeferenced orienteering maps simply means that any random source of input data you might acquire, including cm-level RTK survey-grade measurements from a mapping authority, will just match up, with no need to ever "F9" adjust anything.

PS. I've been a member of the "NTP Hackers" team since the nineties. :-)
PPS. I've given up on ever having the US "see the light" and go metric; your politicians are the best that money can buy, and they will never allow it. :-(
Dec 4, 2013 2:24 PM # 
jjcote:
It's not that the US politicians don't want the metric system, it's that there's no political advantage in making it happen. Too many Americans fail to understand the advantages, and trying to get them to change gets you unelected. The only way it will happen is if corporations see enough of an advantage that they tell the politicians to make it happen and promise that they'll make sure they get elected anyway.
Dec 4, 2013 2:39 PM # 
Tundra/Desert:
Corporations are metric anyway, why would they care about the general public? (you can substitute metric for a number of other things).
Dec 5, 2013 7:01 AM # 
Terje Mathisen:
@jjcote: Thanks for confirming my ideas about US politics. :-(

Your founding fathers definitely understood that the chief (only?) job of an elected politician would be to implement the best possible policies for your country.

I've read somewhere that today a US Congress Representative spends something like 2/3 of his/her term working to make sure of being reelected the next time, with the main part of that spent soliciting private and corporate bribes (sorry: "campaign contributions").

It would be better to have an 8-year term limit; that way they would be able, at least in the second term, to actually try to do something good, like dragging the US into an ISO/metric world. :-)
Dec 5, 2013 11:33 AM # 
jjcote:
We have a system that ensures that we are governed by people who have a demonstrated ability to campaign and get elected.
Dec 5, 2013 12:59 PM # 
Cristina:
Or, if you are a congresscritter in one of several (hundred) special districts, you only need worry about the former once, after which the latter happens seemingly automagically, term after term.

This discussion thread is closed.