⚠️ The HITO Database is no longer used or maintained. It was only used together with the Database Frontend, which has been archived as well.
This repository was archived on 2022-05-16.
Populate the HITO PostgreSQL database and export it back to RDF. Contains software products and related attributes, such as licenses.
- Append to `~/.ssh/config`:

  ```
  Host hitotunnel
      Hostname datrav.uni-leipzig.de
      ProxyJump star
      LocalForward 5432 localhost:55432
      ControlMaster auto
      ControlPath ~/.ssh/sockets/%r@%h:%p
      User root
  ```
- Open the tunnel via `ssh -fN hitotunnel` (see the session sketch after this list)
- When finished, close the tunnel via `ssh -S ~/.ssh/sockets/[email protected]:22 -O exit hitotunnel`
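Note that `ControlMaster` requires the `ControlPath` directory to exist before the first connection. A minimal tunnel session might look like the sketch below; the `psql` user and database names are assumptions, not taken from this repository:

```bash
mkdir -p ~/.ssh/sockets                      # ControlPath directory must exist
ssh -fN hitotunnel                           # open the tunnel in the background
psql -h localhost -p 5432 -U hito -d hito    # hypothetical: talk to the forwarded database
ssh -S ~/.ssh/sockets/[email protected]:22 -O exit hitotunnel  # close it again
```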
Import data from the Virtuoso SPARQL endpoint into the PostgreSQL database in two steps:

```
./download
./import
```
The `download` script converts data from the SPARQL endpoint to `.sql` files. The `import` script executes the SQL statements within those `.sql` files on the HITO database.
**Warning:** `./import` deletes the complete database without confirmation and replaces it with the new data!
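As a rough illustration of what the `download` step does, a SPARQL-to-SQL conversion could look like the following sketch. The endpoint URL, query, and table name are assumptions, not the script's actual contents:

```bash
#!/usr/bin/env bash
# Hypothetical sketch of the SPARQL-to-SQL conversion performed by ./download.
# Endpoint, query, and table name are guesses; escaping is naive and only
# adequate for a sketch.
ENDPOINT="https://hitontology.eu/sparql"
QUERY='PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?s ?label WHERE { ?s rdfs:label ?label } LIMIT 10'
curl -s -G "$ENDPOINT" \
     --data-urlencode "query=$QUERY" \
     -H "Accept: text/csv" |
  tail -n +2 |                                # skip the CSV header row
  while IFS=, read -r s label; do             # naive CSV parsing
    printf "INSERT INTO softwareproduct (uri, label) VALUES ('%s', '%s');\n" "$s" "$label"
  done > softwareproduct.sql
```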
Requirements:

- bash
- Python 3
- psql
- a tunnel from the HITO database to localhost

Data sources:

- HITO, including the Software Ontology
- DBpedia
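After `./import`, a quick sanity check through the tunnel can confirm that data arrived; the user, database, and table names below are assumptions and may differ from the actual schema:

```bash
# Hypothetical sanity check: count the imported software products.
psql -h localhost -p 5432 -U hito -d hito -c 'SELECT COUNT(*) FROM softwareproduct;'
```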
Export data back from the database to the SPARQL endpoint. Requirements:

- bash
- Ontop with the PostgreSQL JDBC driver
- a tunnel from the HITO database to localhost
Copy `scripts/export/hito.properties.dist` to `scripts/export/hito.properties` and add the database password.
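The contents of the `.dist` template are not shown here; assuming it follows Ontop's standard JDBC property names, the finished file might look roughly like this (the URL, user, and database name are guesses):

```
jdbc.url = jdbc:postgresql://localhost:5432/hito
jdbc.user = hito
jdbc.password = YOUR_PASSWORD
jdbc.driver = org.postgresql.Driver
```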
To compare the exported data with the ontology, run:

```
cd diff
./prepare
./compare
./diff
```
Or, even better: instead of `./diff`, run `vimdiff export/output/all.ttl /path/to/my/ontology/swp.ttl` from the base directory, so the changes can be pushed directly into the ontology repository.
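Once the vimdiff review looks good, the ontology-side changes can be committed as usual; the path below reuses the placeholder from above and the commit message is only an example:

```bash
cd /path/to/my/ontology
git add swp.ttl
git commit -m "sync software products with database export"
git push
```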
When the ontology repository is updated, log in to the server, run `git pull` in the `ontology` directory, and then run the following in the `docker` directory:

```
docker-compose down -v
docker-compose build --no-cache
docker-compose up
```
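These steps could be wrapped into a small helper script on the server; the locations of the `ontology` and `docker` directories below are assumptions:

```bash
#!/usr/bin/env bash
# Hypothetical redeploy helper; adjust the two paths to the actual server layout.
set -e
cd ~/ontology && git pull
cd ~/docker
docker-compose down -v
docker-compose build --no-cache
docker-compose up -d   # detached here, unlike the foreground command above
```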