
gallop 0.0.6


Description:


gallop
🐎 A new level of Python development. Source code from GitHub.

More dynamic in design, more configuration-driven.





[-;-/=_
`-; \=_ ___ _________ __ __ ____ ____
) ,"-...--./===--__ / ____/ | / / / / / __ \/ __ \
__|/ / ]` / / __/ /| | / / / / / / / / /_/ /
/;--> >...-\ <\_ / /_/ / ___ |/ /___/ /___/ /_/ / ____/
`- <<, 7/-.\, \____/_/ |_/_____/_____/\____/_/
`- /( `-

Install
pip install gallop

Configuration Management
from gallop.config import BaseConfig

# Create a config with an initial value, then set fields as attributes
config = BaseConfig(a=1)
config.a = 2
config.b = 3

# Serialize the config to disk
config.to_json("some/path.json")
config.to_yaml("some/path.yaml")
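
The saved file is plain YAML, so you can inspect it without going through gallop. The sketch below uses PyYAML rather than gallop's own API, and the printed content is an assumption based on the example above:

# Inspect the saved config with plain PyYAML (illustrative; not gallop's API)
import yaml

with open("some/path.yaml") as f:
    print(yaml.safe_load(f))  # expected to look like {'a': 2, 'b': 3}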

Run a Python task from config
You can turn any callable execution into configuration. For example, save the following
func_name: use:logging.warning
args:
  - "hello world"

to sometask.yaml and run gallop sometask.
This is the same as running the Python script
import logging
logging.warning("hello world")

param: configuration
You can change a value at run time by pointing to its position in the config with a chain of keys (dict keys or list indices).
For example, to change the first element of the args key, run
gallop sometask --param:args.0 changed_world
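
That override replaces the first positional argument of the configured call, so the run above is roughly equivalent to:

import logging

# args.0 of sometask.yaml now holds the overridden string
logging.warning("changed_world")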

Advanced usage
Some simple grammar

checkin: somekey, save the result to a centralized dictionary under the key somekey
checkout: somekey, use the result from the centralized dictionary under the key somekey
checkout: env:DATA_HOME, use the value of the environment variable DATA_HOME
use:some.module, use or import the module some.module as the callable, e.g.

func_name: use:os.path.join, use the function os.path.join
func_name: use:pandas.DataFrame, use the class pandas.DataFrame
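
As a rough mental model only (not gallop's actual implementation), the keywords above behave like this plain-Python sketch:

import importlib
import os

registry = {}                                    # the centralized dictionary

join = importlib.import_module("os.path").join   # use:os.path.join
result = join("data", "raw.csv")                 # call the configured callable
registry["somekey"] = result                     # checkin: somekey

value = registry["somekey"]                      # checkout: somekey
data_home = os.environ.get("DATA_HOME")          # checkout: env:DATA_HOME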



Examples
Run inference with a BERT model

If you have transformers installed, you can run the following example directly to featurize a sentence:

pred_task:
  - func_name: use:transformers.AutoModel.from_pretrained
    args:
      - bert-base-uncased
    checkin: model
    description: |
      Load pretrained model
  - func_name: use:transformers.AutoTokenizer.from_pretrained
    args:
      - bert-base-uncased
    checkin: tokenizer
    description: |
      Load pretrained tokenizer
  - func_name: tokenizer
    args:
      - - "Hello, the capital of [MASK] is Paris"
    kwargs:
      return_tensors: pt
      max_length: 128
      truncation: True
      padding: True
    checkin: inputs
  - func_name: use:logging.warning
    args:
      - func_name: model
        kwargs:
          input_ids:
            checkout: inputs.input_ids
          attention_mask:
            checkout: inputs.attention_mask
        checkin: features

This is equivalent to the following Python script
import logging
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("bert-base-uncased")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
inputs = tokenizer(
    "Hello, the capital of [MASK] is Paris",
    return_tensors="pt",
    max_length=128,
    truncation=True,
    padding=True,
)
logging.warning(model(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
))

Save the YAML to run_bert.yaml; you can then run the task with gallop run_bert --output features.
Run it from the command line with a changed log level and print out one of the checkout values:
gallop run_bert --loglevel debug --output features

And run it with a changed value:
gallop run_bert \
    --param:pred_task.2.args.0.0 "The [MASK] house is where the POTUS live and work"
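
For clarity, pred_task.2.args.0.0 points at the first sentence passed to the tokenizer step, so after the override that step behaves like the following sketch (the model and logging steps are unchanged):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# The overridden sentence replaces the original one from the task config
inputs = tokenizer(
    ["The [MASK] house is where the POTUS live and work"],
    return_tensors="pt",
    max_length=128,
    truncation=True,
    padding=True,
)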

Related projects

From the same lead author


Python category management accelerated with Rust
Data science basic toolset forgebox

License:

For personal and professional use. You cannot resell or redistribute these repositories in their original state.

