pyactcv 0.0.3


pyactcv - python interface
This library connects the cognitive architecture ACT-R with the programming language Python in order to load user data into ACT-R's visicon.
Using the concept of model tracing, previously implemented within an ACT-R tutoring system [1], ACT-R can monitor a human operator's interactions with a system. This library adapts the work of [2] to establish such a connection between Python and ACT-R version 7.12. For example applications of the library, see [3] and [4].
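Loading a row of user data into the visicon essentially means pushing visual features to ACT-R through its command interface. As a rough, hedged illustration of that underlying idea (not the exact calls pyactcv makes internally), assuming ACT-R's add_visicon_features command is available through the dispatcher:

import actr

# Illustrative sketch only: expose one made-up alarm state as a visual feature.
# The feature slots and values here are assumptions for this example.
actr.add_visicon_features(['screen-x', 100, 'screen-y', 50, 'value', 'alarm-active'])
actr.run(1)  # run the model for one simulated second so it can attend to the feature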

Installation
$ pip install pyactcv

or
$ pip install git+https://github.com/seblum/actcv

Usage
Take a look at the examples folder for an example use case.
import pandas as pd

import actr
import pyactcv as cv

# Read the user data; all three columns are expected to hold numeric values
data = pd.read_csv('userData.csv', sep=';',
                   dtype={'alarmactivecolumn': float,
                          'alarmnumbercolumn': float,
                          'timecolumn': float})

header = list(data)
# Replace NaN entries with None so empty cells are passed on as empty values
data = data.where(pd.notnull(data), None)

# Tone and scheduling parameters used by the example
frequency = 3000
duration = 3
starttime = 0
indexinput = 0
timebreak = 0.1

# Create the interface; 'timecolumn' names the column holding the timestamps
actcv = cv.ActCV(data, 'timecolumn')
actcv.load_states()       # load the user data states
actcv.schedule_visicon()  # schedule the visicon updates
actcv.schedule_tone()     # schedule the auditory tone events

actr.run()                # run the ACT-R model
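
The call to pd.read_csv above implies a semicolon-separated userData.csv containing at least the three numeric columns listed in the dtype mapping. A hypothetical file (values made up purely for illustration) could look like this:

timecolumn;alarmactivecolumn;alarmnumbercolumn
0.0;0;0
1.5;1;1
3.0;0;2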

Files


actcv.py - Contains the class ActCV and the methods that create the interface for loading a user data set into the visicon of ACT-R.


actr.py - Contains the dispatcher of ACT-R version 7.12, which is necessary to form the connection between Python and ACT-R (see http://act-r.psy.cmu.edu/).
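
The bundled actr.py exposes the ACT-R commands to Python and can also be used on its own. A minimal sketch, assuming a standard ACT-R 7.12 installation and a hypothetical model file my-model.lisp:

import actr

# Load a hypothetical model file and run the simulation for 10 seconds
actr.load_act_r_model('my-model.lisp')
actr.run(10)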


TODO
Possible additional features to add:

Add a more dynamic read-in for data
Add selection of what to load ("visual", "audio")
Add debugging support

Developing pyactcv
To install pyactcv along with the tools needed to develop it and run its tests, run the following in your virtualenv:
$ pip install -e .[dev]

Bibliography
[1] Fu, W.-T., Bothell, D., Douglass, S., Haimson, C., Sohn, M.-H., & Anderson, J. (2006). Toward a real-time model-based training system. Interacting with Computers, 18(6), 1215–1241.
[2] Halbruegge, M. (2013). ACT-CV - Bridging the gap between cognitive models and the outer world. In E. Brandenburg (Ed.), Grundlagen und Anwendungen der Mensch-Maschine-Interaktion: 10. Berliner Werkstatt Mensch-Maschine-Systeme (pp. 205–210). Berlin: TU Berlin.
[3] Klaproth, O. W., Halbruegge, M., Krol, L. R., Vernaleken, C., Zander, T. O., & Russwinkel, N. (2020). A Neuroadaptive Cognitive Model for Dealing With Uncertainty in Tracing Pilots' Cognitive State. Topics in Cognitive Science, 12(3), 1012–1029.
[4] in review

