Rework the GETTING STARTED section of the POD
nigelhorne committed Nov 18, 2023
1 parent 71a7987 commit 6e3011e
Showing 4 changed files with 32 additions and 15 deletions.
1 change: 1 addition & 0 deletions Changes
@@ -3,6 +3,7 @@ Revision history for Geo-Coder-Free
0.35
Fix scantext.t when OSM is installed
Fix openaddr.t test for not/there
+Rework the GETTING STARTED section of the POD

0.34 Mon Nov 6 16:53:43 EST 2023
Latest DB.pm from NJH-Snippets
2 changes: 1 addition & 1 deletion bin/create_db
@@ -3,7 +3,7 @@ set -e
# First use download_databases to download the databases, then use this
# script to build the G:C:F database in the directory $OPENADDR_HOME

-export OPENADDR_HOME="${OPENADDR_HOME:-~/misc/openaddr}"
+export OPENADDR_HOME="${OPENADDR_HOME:-~/etc/openaddr}"

cd ~/src/njh/Geo-Coder-Free
git pull
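The header comment spells out the intended order: download the raw data first, then build the Geo::Coder::Free database. A minimal sketch of that workflow, run from a checkout of the repository, follows; the exported path is only an example, since the script falls back to ~/etc/openaddr when OPENADDR_HOME is unset.

    # Sketch only - the path is an example, not a requirement
    export OPENADDR_HOME="$HOME/etc/openaddr"
    bin/download_databases    # fetch the raw WhosOnFirst/Openaddr/OSM/dr5hn data first
    bin/create_db             # then build the database under $OPENADDR_HOME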
10 changes: 7 additions & 3 deletions bin/download_databases
@@ -1,14 +1,15 @@
#!/usr/bin/env bash

# Download the databases and create the local SQLite database
-# Be aware that this takes significant disc space
+# Be aware that this takes significant disc space and can take a day or two to complete

set -ex

export WHOSONFIRST_HOME="${WHOSONFIRST_HOME:-~/src/whosonfirst-sqlite}"
-export OPENADDR_HOME="${OPENADDR_HOME:-~/misc/openaddr}"
+export OPENADDR_HOME="${OPENADDR_HOME:-~/etc/openaddr}"
export DR5HN_HOME="${DR5HN_HOME:-~/src/countries-states-cites-database}"
-export OSM_HOME="${OSM_HOME:-~/misc/osm}"
+export OSM_HOME="${OSM_HOME:-~/etc/osm}"

# export http_proxy=http://zack:3128
# export https_proxy=http://zack:3128
export http_proxy=
@@ -28,13 +29,16 @@ mkdir -p $WHOSONFIRST_HOME
# cd $WHOSONFIRST_HOME && ./wof-sqlite-download
cd $WHOSONFIRST_HOME && ./wof-update

+mkdir -p $OPENADDR_HOME

if [ -r $OPENADDR_HOME/openaddresses.sql ]; then
mv $OPENADDR_HOME/openaddresses.sql $OPENADDR_HOME/openaddresses.sql.keep
chmod 444 $OPENADDR_HOME/openaddresses.sql.keep
fi

cd $DR5HN_HOME && git pull

+mkdir -p $OSM_HOME
# cd ~/Downloads && /usr/bin/wget -N -c https://ftpmirror.your.org/pub/openstreetmap/planet/planet-latest.osm.bz2
cd $OSM_HOME && /usr/bin/wget -N -c https://download.geofabrik.de/europe-latest.osm.bz2
/usr/bin/wget -N -c https://download.geofabrik.de/north-america-latest.osm.bz2
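Because the script simply honours whatever the four *_HOME variables contain, a run that keeps everything on a large data disc might look like the sketch below. The /data paths are placeholders, and the WhosOnFirst helper scripts (from NJH-Snippets) and the dr5hn git clone are assumed to already be in place where the script expects them.

    # Placeholder paths - pick a disc with plenty of free space
    export WHOSONFIRST_HOME=/data/whosonfirst-sqlite           # must contain the wof-* helper scripts
    export OPENADDR_HOME=/data/openaddr
    export DR5HN_HOME=/data/countries-states-cities-database   # an existing git clone
    export OSM_HOME=/data/osm
    bin/download_databases    # expect this to take a day or two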
34 changes: 23 additions & 11 deletions lib/Geo/Coder/Free.pm
@@ -394,29 +394,39 @@ sub _abbreviate($) {

=head1 GETTING STARTED
-Before you start,
-install L<App::csv2sqlite>;
-optionally set the environment variable OPENADDR_HOME to point to an empty directory and download the data from L<http://results.openaddresses.io> into that directory;
-optionally set the environment variable WHOSONFIRST_HOME to point to an empty directory and download the data using L<https://github.com/nigelhorne/NJH-Snippets/blob/master/bin/wof-sqlite-download>.
+Before running "make", but after running "perl Makefile.PL", run these instructions.
+Optionally set the environment variable OPENADDR_HOME to point to an empty directory and download the data from L<http://results.openaddresses.io> into that directory; and
+optionally set the environment variable WHOSONFIRST_HOME to point to an empty directory and download the data using L<https://github.com/nigelhorne/NJH-Snippets/blob/master/bin/wof-sqlite-clone>.
+The script bin/download_databases (see below) will do that for you.
+You do not need to download the MaxMind data, that will be downloaded automatically.
You will need to create the database used by Geo::Coder::Free.
+Install L<App::csv2sqlite> and L<https://github.com/nigelhorne/NJH-Snippets>.
+Run bin/create_sqlite - converts the Maxmind "cities" database from CSV to SQLite.
+Optional steps to download and install large databases.
This will take a long time and use a lot of disc space, be clear that this is what you want.
In the bin directory there are some helper scripts to do this.
You will need to tailor them to your set up, but that's not that hard as the
-scripts are trivial
-1. Download_databases - this will download the WhosOnFirst, Openaddr,
+scripts are trivial.
+1. mkdir $WHOSONFIRST_HOME, cd $WHOSONFIRST_HOME, run wof-clone from NJH-Snippets.
+2. Run bin/download_databases - this will download the WhosOnFirst, Openaddr,
Open Street Map and dr5hn databases.
Check the values of OSM_HOME, OPENADDR_HOME,
DRD5HN_HOME and WHOSONFIRST_HOME within that script,
you may wish to change them.
-The Makefile.PL file will download the MaxMind database.
-2. create_db - this creates the database used by G:C:F.
-It's called openaddr.sql,
+The Makefile.PL file will download the MaxMind database for you, as that is not optional.
+3. If you have done step 2,
+run bin/create_db - this creates the database used by G:C:F.
+If not, ignore this step and go to step 4.
+The database is called openaddr.sql,
but that's historical before I added the WhosOnFirst database.
The names are a bit of a mess because of that.
I should rename it, though it doesn't contain the Maxmind data.
-3. create_sqlite - converts the Maxmind database from CSV to SQLite.
+Now you're ready to run "make".
See the comment at the start of createdatabase.PL for further reading.
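Taken together, the reworked instructions boil down to roughly the sequence below. The cpanm installer, the NJH-Snippets checkout location and the wof-clone path are assumptions made for the sake of the example, not something the POD prescribes.

    perl Makefile.PL                   # also fetches the MaxMind data for you
    cpanm App::csv2sqlite              # any CPAN client will do
    bin/create_sqlite                  # required: Maxmind "cities" CSV -> SQLite

    # Optional large databases - lots of disc space and a day or two:
    export WHOSONFIRST_HOME=~/src/whosonfirst-sqlite
    mkdir -p "$WHOSONFIRST_HOME"
    ( cd "$WHOSONFIRST_HOME" && ~/src/NJH-Snippets/bin/wof-clone )   # assumed path to the NJH-Snippets script
    bin/download_databases             # WhosOnFirst, Openaddr, OSM and dr5hn
    bin/create_db                      # builds the openaddr.sql database

    make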
@@ -456,6 +466,8 @@ The OpenAddresses data doesn't cover the globe.
Can't parse and handle "London, England".
The various scripts in NJH-Snippets ought to be in this module.
=head1 SEE ALSO
L<https://openaddresses.io/>,
