Tuesday, December 11, 2012

Debugging maptiler on Mac

Running maptiler on Mac is no easy task, as I've found out.

While there are a few stumbling blocks when it comes to dependencies, one that I found particularly perplexing is the bundling of two Python executables in the .app package.  The maptiler launcher not only runs the rest of the package through one or the other of these Python executables, it chooses between them based on 32- or 64-bit architecture.  That choice is independent of the version of Mac OS being run, or of the "current" Python.  Most dependency installers install relative to the "current" Python and its path.  Therefore, on the 32-bit MacBook Pro I was testing on, I was being kicked over to the bundled Python 2.5 executable and missing all my dependencies, which caused various import errors, including "ImportError: No module named osgeo".
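The launcher decides between the two interpreters with the usual sys.maxint test (visible in the diff below).  As a quick aside, here is a minimal Python 2 sketch you can run under any interpreter to see which way it would be classified:

    # Minimal sketch: report whether this Python build is 64-bit, using the same
    # sys.maxint test the maptiler launcher uses (Python 2 only).
    import sys

    if sys.maxint > 2**32:
        print "64-bit build: maptiler would pick the PythonSnow executable"
    else:
        print "32-bit build: maptiler would pick the bundled Python (2.5) executable"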



Here are the important diffs in maptiler.app that caused it to work.

In ../MapTiler.app/Contents/MacOS/maptiler:

    6,9c6,9
    < #if sys.maxint > 2**32:
    < executable = os.path.join(execdir, "PythonSnow")
    < #else:
    < #    executable = os.path.join(execdir, "Python")
    ---
    > if sys.maxint > 2**32:
    >     executable = os.path.join(execdir, "PythonSnow")
    > else:
    >     executable = os.path.join(execdir, "Python")

And in ../MapTiler.app/Contents/Resources/maptiler/pp/__init__.py:

    141,147d140
    < # fix PYTHONEXECUTABLE dict error
    <
    < if "PYTHONEXECUTABLE" in os.environ:
    <     executable = os.environ["PYTHONEXECUTABLE"]
    < else:
    <     executable = sys.executable
    <
    156c149
    < command = "\"" + executable + "\" -u \"" \
    ---
    > command = "\"" + os.environ["PYTHONEXECUTABLE"] + "\" -u \"" \
 
Relevant link at Google Code: http://code.google.com/p/maptiler/issues/detail?id=42

Friday, December 7, 2012

My Places Overlays on Maptiler Google Map

Maptiler is a good option for creating image overlays (such as historical maps) on Google Maps.  There are some snags though -- the code still uses the Google Maps API v2, which gives some indication of the attention the project is getting right now.

The problem I ran into was adding overlays (such as a KML or GeoRSS feed) on TOP of the image overlay, while keeping correct behavior from the opacity slider built into the maptiler output.

Here are the simple edits to fix this issue.

  1. Added a My Places KML overlay by copying the KML URL from My Places and adding these two lines (I put them at line 185, just below the construction of the new map object):
     geoXml = new GGeoXml("https://maps.google.com/maps/ms?authuser=0&vps=1&ie=UTF8&msa=0&output=kml&msid=204904299571298064529.0004d01c986fdc0fe9f19");
     map.addOverlay(geoXml);
  2. Changed the zPriority of the maptiler overlay so that it sits below other overlays (I edited the line at ~224 which constructs the "overlay" object):
     overlay = new GTileLayerOverlay(tilelayer, {zPriority: -1000});
  3. Finally, fixed the opacity slider by changing the line (around line 62) that removes all overlays so that it removes just the maptiler overlay and re-adds it at zPriority -1000:
     this.map.removeOverlay(this.overlay);
     this.map.addOverlay(this.overlay, { zPriority: -1000 });

That's it!

Hit the geocoding wall?

Google's Geocoding API limits results to 2,500/day.  So what are your options when you hit a geocoding wall with a public/free API?
  1. Using ArcGIS (or alternatively Gisgraphy, which I have not yet experimented with: http://www.gisgraphy.com/) you can geocode against the best data you can get your hands on.  Esri StreetMap data is great, if you have access (as we do at UD).  Or you could download public data, like Census TIGER data, GeoNames, or an OpenStreetMap extract for the area you're interested in, and geocode against those.  There are also some high-quality datasets from cities and states; in the case of Delaware, which licenses TeleAtlas (now TomTom) data, I don't think non-state users are permitted to use it.  I have not seen a way to do this in open source GIS software (e.g., QGIS), which is why ArcGIS or Gisgraphy would be required.
  2. MapQuest/Nominatim (http://developer.mapquest.com/web/products/open/nominatim) and CloudMade have unlimited geocoding APIs built on OpenStreetMap data.  Sometimes these can be slow or inaccurate, though (see the sketch after this list).
  3. MapQuest (non-Nominatim) and Yahoo also have a 5,000 result/day limit ... a 2x improvement on Google.
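To give a sense of what option 2 looks like in practice, here is a minimal Python 2 sketch for geocoding a single address against a Nominatim-style endpoint.  The endpoint URL and response handling are assumptions on my part -- check the provider's documentation and usage policy before batch geocoding:

    # Minimal sketch: geocode one address against a Nominatim-style service.
    # The endpoint URL below is an assumption; substitute the one from the
    # provider's documentation.
    import json
    import urllib

    BASE_URL = "http://open.mapquestapi.com/nominatim/v1/search"

    def geocode(address):
        params = urllib.urlencode({"q": address, "format": "json", "limit": 1})
        results = json.loads(urllib.urlopen(BASE_URL + "?" + params).read())
        if not results:
            return None
        # Nominatim returns lat/lon as strings
        return float(results[0]["lat"]), float(results[0]["lon"])

    print geocode("Newark, Delaware")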

Wednesday, November 14, 2012

NOAA/NCDC CLIMAPS #2 Climate Atlas of the US

I had previously written about a strange projection in the NOAA/CLIMAPS data, and am now following that up with a strange projection in the Climate Atlas of the US data.  This data comes to the end user with no spatial reference, and no documentation on disk that describes one.  However, I found a helpful page on the Climate Atlas website that describes the reference: http://www.ncdc.noaa.gov/oa/about/cdrom/climatls2/faq.html#SPHEROID.  It is not a common reference, but a similar projection is available in ArcGIS; by modifying the reference latitude (i.e., latitude of origin), we can build a custom projection and apply it to these files with the Define Projection tool.  Here is a link to the custom projection file.
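If you have many of these shapefiles to fix, applying the custom projection can be scripted.  A minimal sketch, assuming arcpy (ArcGIS 10.x) is available; the data and .prj paths are placeholders:

    # Sketch: batch-apply a custom .prj to the Climate Atlas shapefiles.
    # Paths below are placeholders for your own copies of the data and
    # the custom projection file.
    import glob
    import arcpy

    custom_prj = r"C:\data\climate_atlas\climate_atlas_custom.prj"
    for shp in glob.glob(r"C:\data\climate_atlas\*.shp"):
        arcpy.DefineProjection_management(shp, custom_prj)
        print "Defined projection for", shp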

Web Maps for Digital Humanities

As I'm currently working with two digital humanities groups on web projects, I have been thinking about easy/free ways to get humanities resources, particularly historical ones, onto the web.  The heavier-weight, self-hosted option we're using is Omeka, and the lighter-weight, cloud-hosted option is ArcGIS Online.  Omeka provides the kinds of exhibits, metadata, and other functionality often crucial to scholarly digital humanities publications, while ArcGIS Online is better suited as a secondary product or as a class project -- with significantly lower adoption costs.  On Omeka, we're using the Neatline and Neatline Maps plugins, as well as Geoserver.

Here's a good workflow for Omeka/Neatline Maps/Geoserver:
  1. Download or scan a (historical) map image
  2. You will need GeoTIFF format to work with Neatline Maps.  I used ImageMagick to convert my scan to TIFF; we'll worry about the georeferencing (the "Geo" in GeoTIFF) in the next step.  You'll need the command-line IM tool -- note that "convert" is also the name of a built-in Windows system executable, so specify the full path to ImageMagick's convert when you run it from the command line, if the IM binary folder is not on your system path (see the sketch after this list).
  3. Now that you have your TIFF, georeference with ArcGIS Explorer Desktop and the Georeferencing Add-In.  Explorer is freely available.  You'll find links to download Explorer and the Georeferencing Add-In, as well as instructions for doing the Georeferencing, here: http://blogs.esri.com/esri/arcgis/2010/12/17/georeferencing-add-in-for-explorer-desktop/
  4. After you've georeferenced and saved your TIFF, import it into your Omeka/Neatline Maps site (Add an Item > Files > "Add New Files", browse to your TIFF ... "Add Item").
  5. Your image should now be added to Geoserver and served back to NL Maps as a WMS.  Though NL Maps is designed to add the WMS reference automatically, we identified a bug that prevented this from occurring.  For now you will need to add the reference directly through the Edit Item dialog, in the Web Map Service tab.  You will need to populate the WMS Address and Layers fields in this tab, in this format:
    1. WMS Address: http://HOSTNAME:PORT/geoserver/NAMESPACE/wms
    2. Layers: LAYERNAME or LAYERNAME1, LAYERNAME2
  6. Next you must create a NL Exhibit through the Neatline tab at the top of the Omeka admin interface.  Once you have created the exhibit, you must Edit Query to include the map image.  I found that the easiest way to get the item you want into the query is to Narrow by Specific Fields using the Title field, with the value set to the map image's title.
  7. Now, when you edit the exhibit, you should see the map image displayed at the proper location.  First, I panned and zoomed to the location of my image.  Then I changed the opacity of the image via Map Settings > Shape Opacity within the Neatline exhibit editing interface.  Next, I fixed the starting point of the map with Layout Editor > Fix starting viewpoint position, within the same interface.  I saved after each of these changes, and then refreshed the page in the browser to see them take effect.
  8. Finally I could view my NL Maps Exhibit through  http://SITEURL/neatline-exhibits/show/NLEXHIBITNAME
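As referenced in step 2, here is a minimal sketch of running the ImageMagick conversion from Python, calling the convert binary by its full path so that Windows' own convert.exe doesn't get picked up.  The install path and file names are placeholders:

    # Sketch: convert a scanned map to TIFF with ImageMagick, called by full
    # path so the Windows built-in convert.exe isn't run by mistake.
    # All paths are placeholders -- adjust to your install and files.
    import subprocess

    im_convert = r"C:\Program Files\ImageMagick\convert.exe"
    subprocess.check_call([im_convert,
                           r"C:\maps\historical_map_scan.jpg",
                           r"C:\maps\historical_map_scan.tif"])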

Wednesday, October 17, 2012

ArcGIS Online Terms of Service, Notes

With the plethora of cloud services being offered these days for hosting and analyzing geospatial data, it is important to know what we're agreeing to.  Unfortunately "the fine print" is still relatively dense.    With ArcGIS.com (ArcGIS Online) opening up their "subscription" service level to their higher ed site license clients, and more of our clients interested in hosting their research, I decided it was time to take another look at the fine print.

Here are some of the main points I took from the License Agreement (E204 06/24/2012) ... you'll want to focus on Addendum 3 (E300-3) which covers Online Services.  These notes should not be taken as legal advice.  I would recommend you check out the license agreement yourself at http://www.esri.com/legal/software-license

Good Stuff

  1. Intellectual property -- it looks like Esri doesn't take ownership of any content.  It remains property of the licensee, the university in this case, and Esri only accesses for purposes of providing the service.
  2. Common sense: If you share your data or grant others access to edit it, they might use it or might alter your service.  You may want to adopt your own use restrictions, published alongside your data.
  3. After the license is terminated (e.g., if the university decided to stop our relationship with Esri, which would be a big deal) the licensee is granted a period of 30 days to retrieve their content -- after which Esri has no further obligations.

Not-So-Good Stuff

  1. The licensee (the university) has a limited number of service credits.  Many of the functions of the subscription (e.g., tile caching and hosting) cost "service credits".  Once we approach 75% of credits consumed, we will be notified.  Once our credits are used up, our subscription services will be suspended until we renew them.  For that reason, the administrator (me) will be extra careful about who is granted access to functions that consume credits.
  2. If you are using an evaluation subscription you must make sure to export your content before the evaluation ends, or it will be permanently lost.

Tuesday, September 11, 2012

pgAgent Stumbling Blocks

In addition to doing scheduled backups from Postgres, we've also needed to load nightly dumps sent via scp to our database server's filesystem.  I finally got pgAgent working, but it wasn't without much consternation.  If you need to do this sort of thing on Windows, I'd recommend using Task Scheduler and a batch file if pg_dump output will work (though it likely won't if the file is coming from Oracle), or Task Scheduler with psql if it won't (see the sketch after the list below).  If you do need to use pgAgent, here are some stumbling blocks that I found:


  • add pgpass.conf to C:\Users\postgres\AppData\Roaming\postgresql
    • should be localhost:5432:*:postgres:PASSWORD 
    • (note, this is not under the default user, it's the "postgres" user on your system) 
  • may also need the following in C:\Program Files (x86)\PostgreSQL\9.0\data\pg_hba.conf (but don't think so) 
    • local all all trust
    • host all all XXX.XXX.XXX.XXX/32 trust 
  • reload configuration *running as admin* 
  • install pgAgent via application/stackbuilder (*run as admin*) 
  • make your steps ignore errors; for some reason they report an error even when they complete successfully, which stalls subsequent steps
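For the simpler Task Scheduler route mentioned above, the scheduled job can be as small as a script that shells out to psql to load the night's dump.  A minimal sketch, with placeholder paths, database name, and user:

    # Sketch: load a nightly SQL dump into Postgres by calling psql.
    # Intended to be run from Windows Task Scheduler; all paths, the database
    # name, and the user are placeholders.
    import subprocess

    psql = r"C:\Program Files (x86)\PostgreSQL\9.0\bin\psql.exe"
    dump_file = r"C:\dumps\nightly_load.sql"

    # Relies on pgpass.conf for the password, as described above.
    subprocess.check_call([psql, "-U", "postgres", "-d", "mydb",
                           "-f", dump_file])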

Tuesday, July 31, 2012

NOAA/NCDC CLIMAPS

The National Climatic Data Center's Climate Maps of the United States database offers polygon data for many popular climate variables.  We would like to use this data for preliminary visualizations for bridge decay research -- the primary benefits of this data being its simplicity (it's already averaged by day/year over a 40-year period), its national continuity, and its file format (shapefile).

However we encountered two issues with the data.

Issue #1: it's based on a funky datum "GCS_NAD_1927_CGQ77", which addresses some issues in Quebec with NAD 1927.

We were able to side-step this issue by just re-defining the projection as something more common ... in our case we went right to WGS 1984, since we wanted to view the data in Google Earth, and since a rough visualization was our immediate purpose.  I suppose NAD 1927 would have been fine, since we weren't doing anything with Quebec.  BTW, there are no readily available transformations for the GCS_NAD_1927_CGQ77 datum.

Issue #2: the data is not continuous or even numerical

Values are expressed as strings like "0 - 5 in", so we have no way of redefining which values are grouped into which colors, and since it's polygon data we also lose the ability to redefine the spatial grouping based on new intervals.  That's not a big problem for this first rough visualization, but we will need to find more fine-grained data for subsequent analysis.  We'll probably end up taking point data from other NCDC sources and doing our own interpolation and temporal reduction.
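If you do need something numeric out of these class strings in the meantime, a crude option is to parse each range and take its midpoint.  A minimal sketch, assuming the field values really do follow the "low - high unit" pattern:

    # Sketch: turn a class string such as "0 - 5 in" into a numeric midpoint.
    # Assumes every value follows the "low - high unit" pattern seen in the data.
    import re

    def class_midpoint(value):
        match = re.match(r"\s*([\d.]+)\s*-\s*([\d.]+)", value)
        if not match:
            return None
        low, high = float(match.group(1)), float(match.group(2))
        return (low + high) / 2.0

    print class_midpoint("0 - 5 in")   # 2.5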

Thursday, May 24, 2012

ArcReader 9/ArcGIS 10 Conflict

I've seen a few folks who've had trouble installing ArcGIS 10 because they've been unable to uninstall ArcReader (usually version 9.x).

The best way I've found to do this is to download the ArcReader installer from here.  Run the installer and try to uninstall.

If you're unable to do that (sometimes you'll need to force quit and restart) you may need to run an install, but uncheck the ArcReader feature.  This clears out some of the lingering installer state.

Then try to run a repair (restart if necessary): start the downloaded ArcReader installer and select repair when prompted.

After you've repaired you should be able to uninstall correctly.

If that doesn't work, you're stuck deleting all ArcReader files and ArcReader (and ArcGIS/Esri) references in the registry using regedit.

Friday, April 27, 2012

Delaware GIS 2012

The Delaware 2012 GIS conference was held yesterday.  It was a great opportunity to meet and learn from our colleagues in Delaware and the region.  I presented on Geospatial Data in Cloud and High Performance Environments, which you can access here: http://udel.academia.edu/BenjaminMearns/Talks/81729/Geospatial_Data_in_Cloud_and_High_Performance_Computing_Environments

Monday, April 16, 2012

Lewes Wind Turbine

I took a trip to the UD Lewes Wind Turbine on Thursday ... I posted a few paragraphs and a VIDEO to Doug White's Blog

Wednesday, March 21, 2012

ENVI/IDL and ArcGIS/Python integration

I occasionally answer questions on ENVI/IDL and ArcGIS/Python integration.

ENVI functions are available as "tools" through the ArcGIS GUI toolbox:

"With ENVI 4.8, ENVI’s scientifically proven image analysis capabilities will be available as discrete tools in the ArcGIS toolbox and accessible directly from ArcGIS desktop and server environments. Upon installation of ENVI products, a selection of powerful image processing and analysis tools will be available within ArcGIS."

A list of those tools is here (in the context of licensing, but just ignore that): http://www.ittvis.com/ProductsServices/ENVI/ToolsLicensing.aspx 




IDL can also be accessed for extension of ENVI/IDL functionality:

"developing customized ENVI tools can be done using simple IDL scripting and wrappers that facilitate the integration of ENVI tools with the ArcGIS platform."

Here's a whitepaper: http://www.ittvis.com/portals/0/whitepapers/ENVI_Tools_ArcGIS_Server_WP_Tutorial.pdf





Thursday, March 8, 2012

Configuring svn on Apache

I followed this page to install/configure svn on my server ... it's written for a Windows-based Apache install, so I've summarized and transcribed the following platform-agnostic (and maybe overly general) steps:

  1. make sure dav_module, dav_svn_module, and authz_svn_module are installed on Apache (assuming you're using webdav to do svn)
  2. create an (arbitrarily named/located) svn config file (maybe called subversion.conf) and include in httpd.conf
  3. create an (arbitrarily named/located) auth file (maybe called svn-auth-file) with the htpasswd binary that comes with Apache (should be under apache/bin), using -cm flag for first user and -m for each additional
  4. create an (arbitrarily named/located) acl file (maybe called svn-acl) and assign permissions to users created in the previous step to groups and "projects" (directories) following this format:

# specify groups here
#
[groups]
team1 = ross, rachel

#
# team1 group has a read/write access to project1 repository
# all subdirectories
# all others have read access only 
#
[project1:/]
@team1 = rw
* = r

#
# project2 repository, only harry and sally have read-write access to project2
#
[project2:/]
harry = rw
sally = rw
* = r


5. inside subversion.conf, create a location block using the following format, including the acl file created in the previous step

<Location /project1>
  DAV svn
  SVNPath C:/Repositories/project1

  AuthType Basic
  AuthName "Subversion Project1 repository"
  AuthUserFile c:/etc/svn-auth-file

  Require valid-user

  AuthzSVNAccessFile c:/etc/svn-acl 
 </Location>

Tuesday, February 28, 2012

ArcGIS Online CSV Hosting: It's Not All There

ArcGIS Online allows a user to upload CSVs, but as we found out, this feature provides less extensive sharing capabilities than other comparable cloud hosting services.

The documentation describes CSV hosting.  It is possible to upload the file for others in a group to download, but it seems that the ability to share the CSV, as a link or as something that could be added to multiple maps, is only available when a premium subscription is activated.

In lieu of a premium subscription, an excellent option is to host the CSV with Google Docs and share the URL with the relevant parties.  They can then add the CSV to their individual ArcGIS Online maps, as this blog post shows.  The downsides of this approach are that it does not integrate with your group as set up on ArcGIS Online, and that you will need to share the CSV publicly from Google Docs, since that is the only way to expose a URL (without some kind of additional programming).  These downsides mostly matter for organizations that have particular data-sharing requirements.

Tuesday, February 21, 2012

Operationalizing SDE Subsetting/Layer Registration

Q: How to get max latitude for each given longitude in a time period subset of points as a GIS layer?

The particular dataset we were working with already had X and Y coordinate fields and was in shapefile format.  After attempting to use ArcGIS tools, including those that expose the limited SQL that ArcGIS offers without SDE, we found that SDE was necessary.  We had already set up an SDE PostGIS/Postgres (PG) database, so we could easily import the shapefile.

First we wanted to create a subset of max latitudes in PG, through the preferred PG interface (ours was pgAdmin III).  Where max is the max subset and data is the original data:
  SELECT * INTO sde.max FROM sde.data WHERE y IN (SELECT max(y) FROM sde.data GROUP BY x);
  ALTER TABLE sde.max OWNER TO sde;
  GRANT ALL ON TABLE sde.max TO sde;

Notice the use of the 'sde.' schema prefix and the permissions statements, which we found necessary (see stumbling blocks below).  We operationalized this by stringing the semicolon-delimited statements together and changing the table names where necessary.  This could be made much more efficient by querying for a list of tables matching the name format of the tables we wished to subset, but we were more interested in getting a result than in fully operationalizing at this time.

Next we needed to register these new tables as SDE layers, so that ArcGIS could "see" them.  After this step, the tables behave as normal ArcGIS feature classes.  Before this step, we could not even get the properties of the tables, let alone interact with them (an error message would be displayed).  We ran the following command to do this through Windows cmd (the SDE bin directory must be on the Path):
sdelayer -o register -l max,shape -e npc -C objectid,sde -i sde:postgresql:localhost -D sde -s localhost -u sde -p sde -t st_geometry
We operationalized this by repeating the command in a .bat file and changing the table names.  Again, this could be made more efficient by using an sde command to get a table list and looping through all tables whose names matched the max-subset naming format (a scripted version of that idea is sketched below).
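Here is a rough sketch of that loop from Python: pull the matching table names out of the PG catalog, then shell out to sdelayer for each one.  The connection parameters, flags, and the 'max' name pattern are copied from the commands above; everything else (the psycopg2 dependency in particular) is an assumption:

    # Sketch: register every table whose name starts with "max" as an SDE layer.
    # Assumes the psycopg2 library is available; connection details and the
    # naming pattern are placeholders based on the commands above.
    import subprocess
    import psycopg2

    conn = psycopg2.connect(host="localhost", dbname="sde",
                            user="sde", password="sde")
    cur = conn.cursor()
    cur.execute("""SELECT tablename FROM pg_tables
                   WHERE schemaname = 'sde' AND tablename LIKE 'max%'""")

    for (table,) in cur.fetchall():
        subprocess.check_call([
            "sdelayer", "-o", "register", "-l", table + ",shape",
            "-e", "npc", "-C", "objectid,sde",
            "-i", "sde:postgresql:localhost", "-D", "sde",
            "-s", "localhost", "-u", "sde", "-p", "sde",
            "-t", "st_geometry"])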

Stumbling Blocks
  1. The most difficult problem we ran into was the "DBMS table not found (-37)" error, which apparently is quite common, for a variety of reasons, with SDE.  This ultimately came down to needing to make sure the table was stored in the sde database, in the sde schema, and that the table was owned by sde.  I had assumed that making sde the owner of the database would cause new tables to be owned by sde by default, but this was not the case.
  2. You may need to update your SDE/ArcGIS Desktop to 10 SP3.  This was mentioned in some forums with the above error and with other errors around not being able to see tables in an SDE DB.  To do the update, you need to download/install SP3 on desktop/sde and then run "update database" from the database properties under ArcCatalog.  However, you must create a direct connection to do the update ... update does not work under the normal ArcCatalog Spatial DB connection.  
  3. General SDE recommendations: my rule of thumb is "anything you can do in ArcGIS/ArcCatalog, do there ... everything else, do in pgAdmin".  Also remember to refresh after each step if there's some result you're trying to see; pgAdmin and ArcCatalog both need a refresh before they show updates.

Friday, February 3, 2012

ArcGIS on a High Performance cluster: Part 1, Linux

Now that we have our new community cluster running at UD, it's time to learn how I can optimize GIS software for that environment.

The first hurdle is the operating system.

ArcInfo Workstation used to be a great way to run Esri/ArcGIS geoprocessing tasks on *nix boxes.  However, it seems the last version of ArcInfo Workstation that ran on Linux was 9.1.  I'd contacted Esri about obtaining a copy of 9.1, but apparently it is out of production and they do not have any copies of the software that they'd be able to send to me.

So basically the workstation/desktop route is not available at this time.  But there is more than one way to skin a cat: ArcGIS Server 10 (AGS) runs on Linux.

I've recently seen documentation which suggests geoprocessing models can be leveraged by publishing them as geoprocessing services through ArcGIS Server.  I have heard at the Esri conference, and seen in some documentation, that success has even been reported with distributed tasks, such as building caches, through the SOM/SOC architecture that is available out of the box in AGS.  Could this architecture be extended to distribute geoprocessing tasks?

Taking a different tack, AGS exposes the geoprocessing object through Python wrappers.  That means we should be able to run our software programmatically on Linux through Python.  Python also has wrappers or libraries for multithreading and MPI (distributed processing), so the implications for taking advantage of our cluster are especially exciting.
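As a taste of what that could look like, here is a minimal sketch that fans a batch of jobs out over local cores with Python's multiprocessing module.  The inputs and the worker body are hypothetical stand-ins -- substitute whatever geoprocessing call your workflow actually needs:

    # Sketch: run a batch of (placeholder) geoprocessing jobs in parallel on
    # one node using multiprocessing.
    from multiprocessing import Pool

    def process_tile(tile_path):
        # placeholder for a real geoprocessing call
        return "processed " + tile_path

    if __name__ == "__main__":
        tiles = ["tile_%03d.tif" % i for i in range(8)]
        pool = Pool(processes=4)
        for result in pool.map(process_tile, tiles):
            print result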

Note: There are some differences to be expected, such as with file path conventions and name lengths. 

Next: Multithreading

http://support.esri.com/en/knowledgebase/techarticles/detail/31903

Wednesday, January 25, 2012

The Atlas of Visual Complexity

Very cool dynamic cartograms for visualizing economic complexity: atlas.media.mit.edu

Thursday, January 19, 2012

Surface Climate Data

I am consulting on a research project involving bridge decay.  For this project we needed to locate surface atmospheric chemistry data (some sulfur oxides and chloride) as well as data on precipitation, snowfall, and min/max temperature.

Here are some sources we found:

Tuesday, January 17, 2012

NetCDF

NetCDF is a format well suited for environmental research given the inherent multi-dimensionality of the environment and environmental research topics. 

I have not worked much with NetCDF data, and recently ran into some trouble importing gridded bathymetric data sets (GEBCO) into ArcGIS.

When importing into ArcGIS I attempted to use Multidimension Tools > Make NetCDF Raster Layer.  This tutorial from Esri describes this procedure.  Unlike in the tutorial, default values were not populated for tool parameters, and as I was unable to ascertain these values, I could not run the tool.

My next option was to learn more about the data set from GDAL's gdalinfo command.  After recompiling with the NetCDF libraries, I was able to run gdalinfo on the data set, but it returned a rather sparse result that did not illuminate any of the parameters that I needed.  This contrasts with the result shown in the GDAL documentation for NetCDF.

I then attempted to load the NetCDF directly into ENVI and IMAGINE -- common geospatial image processing programs -- but I could not load it successfully into either of them.

I was somewhat more successful with IDL.  IDL is able to read NetCDF and report on its structure.  However, the commands for interacting with the result objects are unwieldy at best.  This is an option that requires more investigation.
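Another scripting route for simply dumping a file's structure is Python.  A minimal sketch, assuming the netCDF4 library is installed and using a placeholder filename:

    # Sketch: list the dimensions and variables in a NetCDF file from Python.
    # Assumes the netCDF4 library; the filename is a placeholder.
    from netCDF4 import Dataset

    nc = Dataset("gebco_grid.nc")
    print nc.dimensions.keys()
    for name, var in nc.variables.items():
        print name, var.dimensions, var.shape
    nc.close()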

GMT uses NetCDF as its native grid format, and GEBCO and others seem to base their NetCDF format on GMT's specifications.  The grdinfo command in GMT gives some information about the data set, but still nothing that can be used in ArcGIS.  This also requires more investigation, as GMT has many commands for working with NetCDF.

Finally, the data set could be opened by GEBCO's specialized display software (which is produced by the data provider).

I will be adding more to this post as I learn more about NetCDF and perhaps the peculiarities of this particular data set ... why use open standards if you need a tool created by the data provider to open the data??