hoborequest 0.4

HOBO-request
A Kubernetes-ready container application for integrating HOBOlink loggers with HSDS. This application
was developed in partnership with The HDF Group, with NSF support for the UVA-ARC project (grant #2022639).
Getting started
HOBO-request provides an interface to HOBOlink loggers via the public API provided by Onset.
It can be used with HSDS for large-scale environmental data analysis, or as a stand-alone
tool for testing. It can also be easily extended to work with different APIs and sensor
networks.
You can install the HOBO-request pip package or git clone this repository. To install the pip package,
just create a virtual environment and run:
$ python3 -m venv hobo-request
$ source hobo-request/bin/activate
$ pip install hoborequest

You can also take the development path and work directly with the source code.
To integrate HOBO-request with data loggers, a few libraries and modules need to be installed.
You can proceed by installing the dependencies listed in requirements.txt:
$ python3 -m venv hobo-request
$ source hobo-request/bin/activate
$ pip install -r requirements.txt

HOBO-request is now ready to be used, but first it must be configured!
Configuration
First, copy conf/hobo-connect.conf-SAMPLE to conf/hobo-connect.conf.
Then, edit conf/hobo-connect.conf and set the following parameters:

user_id: your Onset user ID
client_id: your Onset client ID
client_secret: API secret key obtained from Onset
start_date_time: for precise time slicing, the start time in UTC, format: 2012-01-01 23:00:00
end_date_time: same as above, the end time in UTC, format: 2021-01-12 11:00:00
polling_interval_minutes: polling interval for regular data collection. To poll at a regular interval, leave the
start and end time fields empty; alternatively, set a start time and leave the end time empty (see the example below).

Please note that timestamps are always assumed to be in UTC. For data storage, it is more
convenient to avoid time zone and daylight-saving differences and leave it to the analyst to manipulate
timestamps as they see fit.
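For illustration, a time-sliced configuration of these fields could look like the lines below. The values are
placeholders, and the exact file layout (including any section headers) follows conf/hobo-connect.conf-SAMPLE:
user_id = 12345
client_id = your-client-id
client_secret = your-api-secret
start_date_time = 2021-01-01 00:00:00
end_date_time = 2021-01-12 11:00:00

For regular polling instead, leave start_date_time and end_date_time empty and set, for example,
polling_interval_minutes = 10.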
In addition to the configuration of your Onset credentials, you also need to set your HSDS parameters:

hsds_filename: your HDF data store name
hsds_endpoint: your HSDS endpoint

Please note that you can also use HOBO-request locally without HSDS for testing purposes. In that case, you only
need to set hsds_filename to point to the .h5 file in your local filesystem, as illustrated below.
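For example (both values are placeholders; use your own data store name and HSDS endpoint):
hsds_filename = /home/myuser/hobo-data.h5
hsds_endpoint = http://hsds.example.org:5101

For local testing without HSDS, point hsds_filename at a local path instead, e.g. hsds_filename = /tmp/hobo-data.h5.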
Finally, you must set the repository where your logger and sensor metadata are registered. This is a crucial step
because HOBO-request depends on this config to update your metadata attributes for you. You must set the following
fields:

meta_repo: git repository that contains your deployment metadata files
meta_local_dir: local directory where you want to store the repo temporarily
meta_root_path: repository directory where the HDF root file metadata file is
meta_loggers_dir: repository directory where the loggers metadata files are
meta_sensors_dir: repository directory where the sensors metadata files are
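Illustratively (placeholder values; the layout below mirrors the alaska/ examples used further down in this README):
meta_repo = https://github.com/example-org/deployment-metadata.git
meta_local_dir = /tmp/deployment-metadata
meta_root_path = alaska
meta_loggers_dir = alaska/loggers
meta_sensors_dir = alaska/sensors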

Once this is done, we can move on to the generation of your data store.
Creating your HDF data store with metadata
HOBO-request reads YAML files with metadata organized in three levels:

HDF root metadata (attributes)
HOBO loggers (attributes)
HOBO sensors (attributes)

You therefore need to create YAML files following the schema described in the conf/ and
conf/sensors/templates directories. For example:

uva-arc.yaml: provides an example of how to set root attributes in your HDF file
loggers/: provides examples for the configuration of HOBOLink loggers / stations
sensors/templates: provides examples for the configuration of HOBOLink sensors
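As a purely hypothetical sketch of the idea (the keys below are invented for illustration; the templates in conf/
and conf/sensors/templates are the authoritative reference), a root-attributes YAML file might look like:
# hypothetical sketch -- see conf/uva-arc.yaml for the real schema
attributes:
  project: UVA-ARC
  site: Alaska
  contact: someone@example.org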

Once you are done creating your YAML files, you can load them into your newly created HDF file.
Change directory to src/, then run hobo-config.py.
To generate your HDF file with root attributes:
./hobo-config.py test.h5 ../conf/uva-arc.yaml

Or, if you are using the pip package, you can just run:
$ hoboconfig test.h5 ../conf/uva-arc.yaml

Please note that you have to adjust the path to the YAML file in the command above.
You can find real examples for stations and sensors in this repository.
To load your logger and sensor metadata, you can type:
./hobo-config.py test.h5 /tmp/deployment-metadata/alaska/loggers/*
./hobo-config.py test.h5 /tmp/deployment-metadata/alaska/sensors/*

Or... if you are using the pip package hoborequest:
$ hoboconfig test.h5 /tmp/deployment-metadata/alaska/loggers/*
$ hoboconfig test.h5 /tmp/deployment-metadata/alaska/sensors/*

Please note that it is necessary to run this step only once when creating the HDF data store.
After you load the HDF file onto HSDS, HOBO-request will automatically pull all the changes
from the git repository containing your YAML files and load all the metadata information
for you. There is no need to use hobo-config manually again.
Running HOBO-request to consume sensor data
Now that everything is in place, you can choose between running HOBO-request locally for tests
using a regular HDF file (and h5py) or running against the HDF data store on HSDS (with h5pyd). This is
handled automatically for you, so you just have to indicate in your config file whether you want to use
a regular file or interface with HSDS to feed data to your HDF data store.
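The switch typically boils down to the pattern sketched below. This is not HOBO-request's actual code, only an
illustration of the usual h5py/h5pyd selection (the open_store helper is hypothetical), assuming the hsds_filename
and hsds_endpoint settings from the configuration section drive it:
# Illustrative sketch only -- not the application's actual implementation
import h5py   # local HDF5 files
import h5pyd  # HSDS domains, exposing the same high-level API as h5py

def open_store(hsds_filename, hsds_endpoint=None):
    """Open a local .h5 file or an HSDS domain, depending on the config."""
    if hsds_endpoint:
        # HSDS mode: hsds_filename is a domain path known to the HSDS service
        return h5pyd.File(hsds_filename, "a", endpoint=hsds_endpoint)
    # Local mode: hsds_filename is a path on the local filesystem
    return h5py.File(hsds_filename, "a")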
You can start HOBO-request with a simple command:
$ ./hobo-connect.py

Or, if you are running the pip package:
$ hoboconnect

Et... voilà! The application will return the following message as soon as it is done processing
the data:
[HOBO-connect]: data processing completed

Generating a container for HOBO-request
We have provided scripts and deployment files to generate a container for HOBO-request, plus
documentation on how to run HSDS on k3s (or microk8s) for local testing.
You can run the following commands to build your container and run it:
$ ./build.sh
$ ./docker_run.sh

The directory k8s/ contains the deployment files and scripts for running HOBO-request on k8s or microk8s.
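As a rough illustration (the exact manifest names and any namespace or secret setup are defined by the files and
scripts in k8s/), deploying on a local cluster usually comes down to something like:
$ kubectl apply -f k8s/

On microk8s, the same command is available as microk8s kubectl apply -f k8s/.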
May the source be with you in your environmental studies!
Licensing
See LICENSE and AUTHORS files for details.

