
Roadmap


This page contains the general planned features and the overall purpose of this project. More will be added as the scope gradually creeps upward.

1. Create a series of scripts that create and load small lookup tables and codes that remain relatively static.

The government uses a variety of GIS identifiers for marking geographic features. The most common are the GNIS database and the FIPS database. There are also a number of smaller lookup sets identifying additional codes that their data products contain. As a cross-check, I have also found an independent US cities list geocoded to boundary centers. One note about the census files: the places in the data are not always towns; they are sometimes closed communities, prisons, and other institutions.
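Roughly the kind of loader script I have in mind, as a hedged sketch: it assumes a pipe-delimited state FIPS export and a hypothetical `StateFips` table, neither of which is the real product spec.

```csharp
// Minimal sketch: load a pipe-delimited FIPS state code export into a lookup table.
// The file layout and the "StateFips" table are assumptions, not the actual spec.
using System;
using System.Data.SqlClient;
using System.IO;

class FipsLoader
{
    static void Main()
    {
        using var conn = new SqlConnection("your connection string here");
        conn.Open();
        foreach (var line in File.ReadLines("state_fips.txt"))
        {
            var parts = line.Split('|');
            if (parts.Length < 2) continue; // skip malformed rows
            using var cmd = new SqlCommand(
                "INSERT INTO StateFips (Code, Name) VALUES (@code, @name)", conn);
            cmd.Parameters.AddWithValue("@code", parts[0].Trim());
            cmd.Parameters.AddWithValue("@name", parts[1].Trim());
            cmd.ExecuteNonQuery();
        }
    }
}
```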

2. Create a series of utilities that read government-provided dBASE and Shapefile pairings.

Almost all of the shapefiles are indexed by a dBASE IV database, providing (most of the time) a few useful tidbits of information: a unique identifier for each region contained, plus other naming, quantifying, and classification fields. These records appear in the same order as their corresponding shape data in the files. The filenames can sometimes be split to retrieve FIPS code identifiers that can be linked against the FIPS database, indicating the major region the shapes are contained in, most frequently the state.
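A minimal sketch of reading the .dbf side of a pairing, based on the published dBASE header layout; the filename is just an example TIGER/Line file.

```csharp
// Minimal sketch of reading the dBASE (.dbf) side of a shapefile pairing.
// Record N here corresponds to shape N in the matching .shp file.
using System;
using System.IO;
using System.Text;

class DbfReader
{
    static void Main()
    {
        using var br = new BinaryReader(File.OpenRead("tl_2020_us_state.dbf"));
        br.ReadByte();                       // version byte
        br.ReadBytes(3);                     // last-update date (YY MM DD)
        uint recordCount = br.ReadUInt32();  // number of records
        ushort headerSize = br.ReadUInt16(); // offset of the first record
        ushort recordSize = br.ReadUInt16(); // bytes per record (incl. delete flag)

        br.BaseStream.Seek(headerSize, SeekOrigin.Begin);
        for (uint i = 0; i < recordCount; i++)
        {
            byte flag = br.ReadByte();       // 0x2A ('*') marks a deleted record
            byte[] data = br.ReadBytes(recordSize - 1);
            if (flag != 0x2A)
                Console.WriteLine($"Record {i}: {Encoding.ASCII.GetString(data).Trim()}");
        }
    }
}
```

The important invariant is that last loop: record N in the .dbf describes shape N in the paired .shp, which is what makes the pairing useful.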

3. Create a utility that will plot paths of movement.

I have considered heavily modifying the A* pathfinding algorithm to account for human limitations: load, average walking speed, weather conditions, and human needs, along with things like traversable and insurmountable boundaries, to help a stranded person reach another location with as much convenience and safety as possible, including possible sites to sleep, if need be, where they will likely not encounter other humans. I was hoping to incorporate the species data as well; insect and parasite data would also be nice, plus pollution samples from various points in rivers and streams and the locations of polluting manufacturers, to advise on gear. But we'll start with 'get from point A to point B without encountering a mountain chain, gorge, river, wetland, or impassable highway, dying of thirst or hunger, or being eaten by an f-ing wolf!'
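A toy sketch of the idea: plain grid A* where the step cost is scaled by a made-up per-cell effort multiplier and carried load. The weights are placeholders, not calibrated human factors.

```csharp
// Sketch of A* over a terrain grid where step cost is weighted by hypothetical
// human factors (terrain effort, carried load). All weights are placeholders.
using System;
using System.Collections.Generic;

class PedestrianAStar
{
    // Toy cost model: harder terrain and heavier load make a step more expensive.
    static double StepCost(double effort, double loadKg) =>
        effort * (1.0 + loadKg / 100.0);

    public static List<(int x, int y)>? FindPath(
        double[,] effort, (int x, int y) start, (int x, int y) goal)
    {
        int w = effort.GetLength(0), h = effort.GetLength(1);
        var open = new PriorityQueue<(int x, int y), double>();
        var gScore = new Dictionary<(int, int), double> { [start] = 0 };
        var cameFrom = new Dictionary<(int, int), (int, int)>();
        open.Enqueue(start, 0);

        while (open.TryDequeue(out var cur, out _))
        {
            if (cur == goal)
            {
                var path = new List<(int, int)> { cur };
                while (cameFrom.TryGetValue(cur, out var prev)) { cur = prev; path.Add(cur); }
                path.Reverse();
                return path;
            }
            foreach (var (dx, dy) in new[] { (1, 0), (-1, 0), (0, 1), (0, -1) })
            {
                var next = (x: cur.x + dx, y: cur.y + dy);
                if (next.x < 0 || next.y < 0 || next.x >= w || next.y >= h) continue;
                if (double.IsInfinity(effort[next.x, next.y])) continue; // insurmountable
                double g = gScore[cur] + StepCost(effort[next.x, next.y], loadKg: 15);
                if (!gScore.TryGetValue(next, out var old) || g < old)
                {
                    gScore[next] = g;
                    cameFrom[next] = cur;
                    double hCost = Math.Abs(goal.x - next.x) + Math.Abs(goal.y - next.y);
                    open.Enqueue(next, g + hCost); // Manhattan heuristic
                }
            }
        }
        return null; // no traversable route
    }
}
```

Cells marked with infinite effort stand in for the insurmountable boundaries (gorges, impassable highways); everything else just gets more or less expensive.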

4. Create a utility which will section large uncompressed ERDAS IMAGINE files into smaller compressed tiles that can be pulled into collections of data being retrieved from the server.

I have questions about this. I believe that programs like QGIS display scaled image data so quickly because of the way the files are READ, allowing them to scale by exclusion and really pick up the pace. If PNGs can be read similarly there shouldn't be an issue, but simply put, I need to find a compressible format that can be scaled in such a way. The key word "compressible" creates a small problem with this logic: if compression is row-based, it might be possible; if it's file-based, there is a large issue. I'll have to play around with some ideas. For space conservation, 20 GB per file is a tad on the large side, but it does amaze me just how fast programs like QGIS can locate the proper tiles and render them at speed, even rendering the WHOLE file when the viewport is scaled down. Obviously, a pathfinding algorithm is also going to need some form of look-ahead/look-back functionality to ensure a course isn't plotted MILES off.
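The tile lookup itself is cheap index math, which is probably most of the trick. A sketch, assuming a fixed 512-pixel tile size (an arbitrary choice):

```csharp
// Sketch of the tile-index math: given a viewport in raster pixel coordinates,
// work out which fixed-size tiles need to be fetched. Tile size is an assumption.
using System;
using System.Collections.Generic;

class TileIndex
{
    const int TileSize = 512; // assumed tile edge in pixels

    // Returns (col, row) keys for every tile the viewport touches.
    public static IEnumerable<(int col, int row)> TilesFor(
        int viewX, int viewY, int viewW, int viewH)
    {
        int c0 = viewX / TileSize, r0 = viewY / TileSize;
        int c1 = (viewX + viewW - 1) / TileSize, r1 = (viewY + viewH - 1) / TileSize;
        for (int r = r0; r <= r1; r++)
            for (int c = c0; c <= c1; c++)
                yield return (c, r);
    }

    static void Main()
    {
        // A 2048x1024 viewport starting at pixel (10000, 4000) touches these tiles:
        foreach (var t in TilesFor(10000, 4000, 2048, 1024))
            Console.WriteLine($"tile_{t.col}_{t.row}.png");
    }
}
```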

5. Create a series of Data Access classes that form a programmable API to be accessed from C# code.

This will be a pretty simple thing to do, as I've done it many times in my professional career. One consideration, however, is how the data is being retrieved, so the model needs to be finished before the access functions are created. I'll be using the standard pattern: create stored procedures, then use a set of ADO.NET classes to call them and populate lists of data objects.
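For illustration, the pattern looks like this; the stored procedure name, its parameter, and the `Place` record are hypothetical stand-ins for the real model:

```csharp
// Illustration of the stored-procedure + ADO.NET pattern described above.
// The proc, its parameter, and the Place type are placeholders, not the real model.
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public record Place(string GnisId, string Name, double Lat, double Lon);

public class PlaceRepository
{
    private readonly string _connString;
    public PlaceRepository(string connString) => _connString = connString;

    public List<Place> GetPlacesByState(string stateFips)
    {
        var results = new List<Place>();
        using var conn = new SqlConnection(_connString);
        using var cmd = new SqlCommand("dbo.GetPlacesByState", conn)
        {
            CommandType = CommandType.StoredProcedure
        };
        cmd.Parameters.AddWithValue("@StateFips", stateFips);
        conn.Open();
        using var reader = cmd.ExecuteReader();
        while (reader.Read())
        {
            results.Add(new Place(
                reader.GetString(0), reader.GetString(1),
                reader.GetDouble(2), reader.GetDouble(3)));
        }
        return results;
    }
}
```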

6. Create a series of data organization scripts that will create linking tables specifying foreign relationships between various data elements, allowing quicker retrieval of subsets of point and shape data and other descriptive elements without having to search ad hoc. This limits data crunching to pulling back the intersecting, co-occurring point data, instead of searching every element every time.

This is going to include everything from linking elements against each other by their ordinary organization (state, locale, etc.) to charting out in advance which specific shapes intersect with each other and creating many-to-many links on them, as well as on the raster tiles that will be allotted.
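A sketch of what one of those scripts might do; the `ShapeTileLink` table and its columns are placeholders:

```csharp
// Sketch of a script that materializes a many-to-many link between shapes and
// the raster tiles they intersect. Table and column names are placeholders.
using System.Data.SqlClient;

class LinkTableBuilder
{
    const string Ddl = @"
        CREATE TABLE ShapeTileLink (
            ShapeId INT NOT NULL,    -- FK to the shape record
            TileId  INT NOT NULL,    -- FK to the raster tile
            PRIMARY KEY (ShapeId, TileId)
        );";

    static void Main()
    {
        using var conn = new SqlConnection("your connection string here");
        conn.Open();
        using var cmd = new SqlCommand(Ddl, conn);
        cmd.ExecuteNonQuery();
        // The population step would iterate the shapes, compute intersecting tile
        // keys in advance, and bulk-insert the (ShapeId, TileId) pairs.
    }
}
```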

7. Integrate data about food stores and SNAP-accepting facilities.

The USDA puts out decent information regarding food stores and SNAP-accepting stores. Locating a free list of retailers and restaurants has not been so easy. Google gathers this info because EVERYONE wants to be found on Google; there may be address data in each individual state's department of state. With a little tinkering these addresses could be converted into rooftop-geocoded ones. There is some question, however, as to what data this will provide. I don't really need to know the locations of corporate offices for McDonald's or Walmart; I need to know where the stores are located, everything from malls to restaurants and general retailers. The other issue may be that in the case of franchises a not-very-helpful name will pop up, like "Bob and Martha's LLC", which tells me absolutely nothing!

8. Integrate a plant and edibles database to help with foraging efforts.

The databases I have found are not terrible, but they require some tinkering to be parseable. They would also need to be paired with allergen data, because if you eat something you're allergic to in the middle of nowhere... well... let's just say that would suck. For this data to be useful, a person would have to spend some time experimenting close to home. Still, knowing you can eat the inner bark of certain types of fir trees, when you're in a fucking fir forest with no goddamn berry bushes, is kind of helpful information!

9. Plot safe routes that maximize the average vagabond's travel near resources and away from pests, wetlands, bypasses, obstructions, and hazards.

I need to locate a decent crime database to add to this. The goal is to keep a person as NEAR civilization as possible, out of the sun, etc. Human hazards are included in this.

10. Provide a series of tools to manage downloads of various government-provided resources (which, for some reason, they haven't updated), in order to determine a series of averages that help plan for inclement weather and temperature ranges by region, and to adjust daily expectations of movement.

Some of the data Uncle Sam offers for download is annoying to get to. So I have devised a few browser automation tools that stay within his good graces and don't spam his sites, so that you don't have to sit and click for fucking hours, but also don't go over the download limit and get banned. The climate data is especially annoying to get at, and even more annoying to parse. Average cloud cover, rain storms, temperatures, etc. are kind of important data if someone is wandering around outside!
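The throttling logic itself is simple enough; here is a sketch with guessed limits (the real per-site caps would need to be confirmed):

```csharp
// Sketch of a polite downloader: a fixed delay between requests and a hard cap
// per session, so the automation stays under an assumed download limit.
using System;
using System.Net.Http;
using System.Threading.Tasks;

class ThrottledDownloader
{
    static readonly HttpClient Http = new HttpClient();

    public static async Task FetchAllAsync(string[] urls,
        int maxPerSession = 100, int delayMs = 5000) // both limits are guesses
    {
        int fetched = 0;
        foreach (var url in urls)
        {
            if (fetched++ >= maxPerSession) break; // stop before tripping a ban
            var bytes = await Http.GetByteArrayAsync(url);
            await System.IO.File.WriteAllBytesAsync(
                System.IO.Path.GetFileName(new Uri(url).LocalPath), bytes);
            await Task.Delay(delayMs); // spread requests out
        }
    }
}
```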

11. Create a series of coordinate transformations.

I have to decide whether I want to use CONUS Albers NAD83 natively in the project; there are real advantages to this. However, if someone wants to visually reference the points, the standard sexagesimal (I always heard "segagesimal", but OK) WGS 84 / EPSG:4326 is what they'll expect. The upside of CONUS Albers NAD83 is that the coordinates are based on actual distance from reference points, whereas sexagesimal coordinates need special calculations to determine the distance between them, given the way the earth is actually shaped. I'm leaning heavily towards NAD83, but then I need to add additional logic and include additional data files. There is some difference in how different agencies store coordinate data. The species data from the USGS, infuriatingly, was my first exposure to CONUS Albers, and since my parser works perfectly, I found myself wondering 'MAN, WTF!' when I was receiving huge numbers well outside the range of standard coordinates. The PROJ projection library is included in GDAL, so the functions for converting between one coordinate system and another are already there, even if they are annoying to access.
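A sketch of that conversion through the GDAL/OSR C# bindings, assuming GDAL 3's default axis order for EPSG:4326 (latitude first); EPSG:5070 is the NAD83 / CONUS Albers code:

```csharp
// Sketch of a WGS 84 (EPSG:4326) -> NAD83 / CONUS Albers (EPSG:5070) conversion
// via the GDAL/OSR C# bindings, which wrap PROJ as noted above.
using System;
using OSGeo.OSR;

class AlbersConvert
{
    static void Main()
    {
        var wgs84 = new SpatialReference("");
        wgs84.ImportFromEPSG(4326);
        var albers = new SpatialReference("");
        albers.ImportFromEPSG(5070); // NAD83 / CONUS Albers

        var xform = new CoordinateTransformation(wgs84, albers);
        double[] point = { 38.8977, -77.0365, 0 }; // GDAL 3: 4326 is lat, lon order
        xform.TransformPoint(point);               // point becomes easting, northing
        Console.WriteLine($"easting={point[0]:F1} m, northing={point[1]:F1} m");
    }
}
```

This is also exactly where those huge out-of-range numbers come from: Albers coordinates are meters from the projection origin, not degrees.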

Of note, some parties seem very interested in NOT allowing projects like this to move forward, since the information could be misused; unfortunately that fucks the rest of us as well.
