This task was interesting: it required improving the workflow for tracking the locations of fishing equipment at sea, in areas where other operations were ongoing.
The goal was to create ESRI shapefiles containing the unique locations of each piece of fishing equipment logged within a specific time period
Emails containing positional information for each piece of equipment in the water were sent to a specific address
Each email might contain the information as a PDF or CSV attachment, or embedded in the email text
The data was not consistent, e.g. the positions received in the emails could be stale and not updated for long periods, and some updates might arrive in one email while others arrived in the next
The positions were all supplied using a WGS84 CRS
I wrote a small web service in Perl that scraped the email inbox via IMAP, looking for emails from specific senders.
Once an email of interest is spotted, any attachments are extracted; if the email body contains the positional data, it is extracted to a CSV file
The email scraping service considers only emails that have arrived since the last time the inbox was queried
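The original service was written in Perl; as a rough illustration of the same idea, here is a sketch in Python using the standard-library `imaplib` and `email` modules. The host name, sender list, and credentials handling are placeholders, and tracking the last-seen UID is one simple way to honour the "only emails since the last query" rule.

```python
import email
import imaplib
from email.message import EmailMessage

# Placeholder host and sender list; the real service used its own config.
IMAP_HOST = "imap.example.com"
SENDERS = ["positions@operator.example"]

def search_criterion(sender, last_uid):
    """IMAP UID search: messages from `sender` newer than `last_uid`."""
    return f'(UID {last_uid + 1}:* FROM "{sender}")'

def extract_attachments(msg):
    """Yield (filename, raw bytes) for each attachment on a message."""
    for part in msg.walk():
        if part.get_content_disposition() == "attachment":
            yield part.get_filename(), part.get_payload(decode=True)

def fetch_new_messages(user, password, last_uid=0):
    """Return (uid, Message) pairs that arrived since the last poll."""
    conn = imaplib.IMAP4_SSL(IMAP_HOST)
    conn.login(user, password)
    conn.select("INBOX")
    out = []
    for sender in SENDERS:
        # Searching by UID rather than sequence number stays stable even
        # if messages are deleted between polls.
        _, data = conn.uid("SEARCH", None, search_criterion(sender, last_uid))
        for uid in data[0].split():
            _, raw = conn.uid("FETCH", uid, "(RFC822)")
            out.append((int(uid), email.message_from_bytes(raw[0][1])))
    conn.logout()
    return out
```

Persisting the highest UID seen between polls (e.g. in the same SQLite database) is what makes the incremental behaviour survive restarts.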
The positions are then parsed and inserted into a small SQLite database
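The schema below is an assumption (the original stored whatever fields the emails supplied), but a sketch of the ingest step might look like this, with a UNIQUE constraint doing the duplicate suppression:

```python
import sqlite3

# Assumed schema: equipment id, WGS84 lat/lon, and the report timestamp
# as ISO 8601 text (which sorts chronologically as a string).
SCHEMA = """
CREATE TABLE IF NOT EXISTS positions (
    equipment_id TEXT NOT NULL,
    lat          REAL NOT NULL,
    lon          REAL NOT NULL,
    reported_at  TEXT NOT NULL,
    UNIQUE (equipment_id, lat, lon, reported_at)
)
"""

def insert_positions(conn, rows):
    """Insert parsed (equipment_id, lat, lon, reported_at) rows,
    skipping exact duplicates via the UNIQUE constraint."""
    conn.execute(SCHEMA)
    conn.executemany(
        "INSERT OR IGNORE INTO positions VALUES (?, ?, ?, ?)", rows
    )
    conn.commit()
```

`INSERT OR IGNORE` means re-processing an email that repeats earlier positions is harmless, which matters given that the senders often resent stale data.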
At a specified interval, the web service queries the SQLite database to extract the valid positions (those that are not duplicated and not too old)
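The validity rule can be expressed in a single query; this sketch (against the same assumed schema) takes the newest report per piece of equipment and discards anything older than a cutoff. It relies on SQLite's documented behaviour that, with a bare `MAX()` aggregate, the other selected columns come from the matching row.

```python
import sqlite3

# Newest report per item, ignoring anything older than the cutoff.
VALID_SQL = """
SELECT equipment_id, lat, lon, MAX(reported_at) AS reported_at
FROM positions
WHERE reported_at >= ?
GROUP BY equipment_id
"""

def valid_positions(conn, cutoff_iso):
    """Return one row per piece of equipment: its newest position
    reported at or after `cutoff_iso` (ISO 8601 text)."""
    return conn.execute(VALID_SQL, (cutoff_iso,)).fetchall()
```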
The Perl service then calls ogr2ogr to convert the extracted positions into ESRI shapefiles
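The exact flags the Perl service passed aren't recorded; as a sketch, an ogr2ogr invocation converting a positions CSV into an ESRI shapefile could look like the following. The `lon`/`lat` column names are assumptions; `X_POSSIBLE_NAMES`/`Y_POSSIBLE_NAMES` are open options of GDAL's CSV driver, and `-a_srs EPSG:4326` assigns the WGS84 CRS the positions were supplied in.

```python
import subprocess

def ogr2ogr_command(csv_path, shp_path):
    """Build an ogr2ogr invocation turning a positions CSV into an
    ESRI shapefile (column names and flags are assumptions)."""
    return [
        "ogr2ogr",
        "-f", "ESRI Shapefile",
        shp_path,
        csv_path,
        "-oo", "X_POSSIBLE_NAMES=lon",
        "-oo", "Y_POSSIBLE_NAMES=lat",
        "-a_srs", "EPSG:4326",  # positions were supplied in WGS84
    ]

def convert(csv_path, shp_path):
    """Shell out to ogr2ogr, raising if the conversion fails."""
    subprocess.run(ogr2ogr_command(csv_path, shp_path), check=True)
```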
An email is constructed with the ESRI shapefiles attached; the email body contains a summary of the shapefile contents, so the end user knows the status before loading them into any other software.
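A sketch of that final step, using Python's standard `email` library (addresses and subject line are placeholders). One detail worth noting: an ESRI shapefile is really several files (.shp, .shx, .dbf, .prj), each of which needs attaching.

```python
from email.message import EmailMessage

def build_report_email(sender, recipient, attachments, summary_lines):
    """Assemble the outgoing report: a plain-text summary in the body
    and the shapefile components attached.
    `attachments` is a list of (filename, bytes) pairs."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = "Fishing equipment positions"
    msg.set_content("\n".join(summary_lines))
    for name, data in attachments:
        msg.add_attachment(
            data,
            maintype="application",
            subtype="octet-stream",
            filename=name,
        )
    return msg
```

The resulting `EmailMessage` can be handed to `smtplib.SMTP.send_message` for delivery.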
After the system was implemented, it was used for more than six months without any issues