Getting Started

 
Download the 32-bit or 64-bit version of the Connectivity Analysis Toolkit from www.connectivitytools.org. If you have a 64-bit Windows operating system, you can install either version; the 64-bit version is worth installing if you have more than 3 GB of RAM.
Although most analyses can be performed at acceptable resolutions with the 32-bit version, the 64-bit version may be preferable if you are attempting analyses at high spatial resolution over large extents, which produce graphs with large numbers of nodes (see System Requirements).
 
Installation
 
The Toolkit is supplied as a self-extracting installer. Run the installer and either select the default options or choose a custom installation directory. You will want to copy the 'Tutorial' directory to another location on your system in order to work through the tutorial analyses. Installation of the Toolkit may require administrator privileges on some systems. After installation, start the Toolkit from the Start Menu or Desktop shortcut.
 
Workflow
 
Typical workflow begins with import of an ASCII (.asc) file created from habitat data in a GIS.
The Toolkit processes this raster data into a vector format (a shapefile) and associated hexagon file (.hxn, a file format used by the Toolkit) and node coordinate file.
The hexagon file is then used to produce a text file representing a graph with a node for each hexagon in the shapefile.
The graph data is then written in one of two 'graph file formats', the edgelist format accepted by the NetworkX module or the LEMON graph format (lgf) required by the LEMON module (see below).
The user then performs one of several types of centrality analysis on the graph and produces output giving node centrality values.
This output file can then be joined to the shapefile using GIS software such as ESRI's ArcGIS.
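As a minimal sketch of the analysis step described above, a graph written in the edgelist format can be loaded with the NetworkX module and a centrality measure computed for each node. The edge data below is invented for illustration and does not come from actual Toolkit output:

```python
# Hypothetical sketch: load a small weighted edgelist with NetworkX and
# compute betweenness centrality for each node (hexagon).
import networkx as nx

edgelist_lines = [
    "1 2 1.0",   # source node, target node, edge weight (e.g., movement cost)
    "2 3 2.0",
    "1 3 4.0",
]

# Parse the edgelist into an undirected weighted graph.
G = nx.parse_edgelist(edgelist_lines, nodetype=int, data=(("weight", float),))

# Betweenness centrality, treating edge weights as distances.
centrality = nx.betweenness_centrality(G, weight="weight")

# Node/centrality pairs, ready to be joined back to the shapefile in a GIS.
for node, value in sorted(centrality.items()):
    print(node, value)
```

Here node 2 lies on the cheapest path between nodes 1 and 3 (cost 1.0 + 2.0 versus the direct edge's 4.0), so it receives the highest betweenness value.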
 
Note that graph format files created in another application (e.g., as representations of patch networks rather than landscape lattices) may also be analyzed using the Toolkit's 'Connectivity' tab.
 
Why can't this all be integrated into GIS software?
 
While modern GIS software such as ESRI's ArcGIS is a powerful tool, and ArcGIS in particular is increasingly integrated with scripting languages such as Python, important limitations remain. For example, the Toolkit's graph analysis algorithms require highly optimized versions of Python's numerical libraries that may conflict with the versions used by ArcGIS.
 
Why can't this all be done without GIS software?
 
We have focused on creating a Toolkit with advanced connectivity analysis methods that are not available in GIS software, rather than trying to duplicate GIS functions available in other software. We assume users will have access to software such as ArcGIS to create .asc files, identify source and target areas for subset centrality analysis, and perform post-processing of output, such as joining centrality output files to the shapefile.
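The .asc files mentioned above follow ESRI's ASCII grid layout: six header lines describing the grid geometry, followed by one line of cell values per raster row. As a hedged illustration (the habitat values, coordinates, and cell size below are invented), a minimal grid of this form looks like:

```python
# Illustrative ESRI ASCII grid (.asc) contents, as exported from a GIS.
# All values here are invented for the example.
asc_text = """\
ncols 4
nrows 3
xllcorner 500000.0
yllcorner 4000000.0
cellsize 90.0
NODATA_value -9999
0.1 0.4 0.4 0.2
0.3 0.9 0.8 -9999
0.2 0.7 0.5 0.1
"""

# The first six lines are the header; the rest hold cell values,
# one raster row per line, from the top of the grid downward.
lines = asc_text.splitlines()
header, rows = lines[:6], lines[6:]
print(len(rows))  # number of raster rows
```

Cells carrying the NODATA_value (here -9999) mark areas outside the study extent.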
 
But the Toolkit's methods may take hours, whereas I can map corridors in a GIS in minutes. Why is it worth the trouble?
 
We make the case in the remainder of this manual that the Toolkit's connectivity analysis methods can complement methods commonly available in GIS, and allow planners to better evaluate alternate assumptions about how to represent wildlife movement and ecological processes. Land-use planning for biodiversity conservation necessarily involves decisions that have large economic and social impacts, and thus merits use of the most rigorous and informative tools available. The analyses described in this manual, although they do sometimes require substantial computation time (see System Requirements), would not even have been possible on desktop computer systems as recently as two years ago. Versions of the software starting with version 1.2 include approximation-based algorithms that greatly increase the speed of analyses compared to earlier versions. The scope and resolution of analyses feasible with the Toolkit's methods will continue to expand over the coming years, as the algorithms underlying these methods are under active development.

The help manual was created with Dr.Explain