Power Supply Data

The Near Detector for the NOvA project currently uses nine Wiener PL506 power supplies. These power supplies have an Ethernet interface, but for security reasons they are locked safely behind a firewall within Fermilab's network. We, however, need access to the data at the University of Virginia. The solution I developed uses Google Spreadsheets as data storage in the internet cloud. Almost all of Google's services have an XML-based API for accessing and controlling their data. Using a Python library for the Google Spreadsheet API, I created a script that runs on a computer at Fermilab every few minutes. It reads all of the data from the power supplies, formats it, and sends it to a Google Spreadsheet, with each polling cycle entered as a single row; new data is always appended as the last row of the spreadsheet. Below you will find iGadgets from Google which provide graphical access to the data, followed by a description of the script file and the other resources used to collect the data. A gzipped tar file of the Python script is also available below.
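
To give a flavor of the format-and-send step, here is a minimal sketch of how one polling cycle might be flattened into a single spreadsheet row. The column names and the commented-out gdata call are illustrative assumptions, not the exact names the real script uses:

```python
import time

def build_row(readings, when):
    """Flatten one polling cycle into a dict representing a spreadsheet row.

    `readings` maps a column name to a measured value; a 'time' column is
    added so each row is self-describing.  The column names here are made
    up for illustration -- the real script fills ~150 of them.
    """
    row = {'time': time.strftime('%Y-%m-%d %H:%M:%S', when)}
    for name, value in readings.items():
        row[name] = '%.2f' % value   # fixed-precision strings for the sheet
    return row

row = build_row({'u0voltage': 3.298, 'u0current': 1.512}, time.gmtime(0))
# With the gdata library, the row would then be appended along the lines of:
#   client.InsertRow(row, spreadsheet_key, worksheet_id)
```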

Voltages for FEBs (3.3V)


Currents for FEBs (3.3V)


Voltages for TECs (24V)


Currents for TECs (24V)


Power Supply Temperatures


Voltages for APDs (High Voltage)


Currents for APDs (High Voltage)


Power Supply Temperatures (HV)


Power Supply Temperatures [OLD]

Data Collection Python Script: sendWienerData

This is still a bit of an experiment. As I discovered first-hand, Google limits a spreadsheet to 400,000 cells. This spreadsheet uses 150 columns, which leaves only 2666 available rows. Each row represents 3 minutes, so we can currently store a total time log of about 8000 minutes, or roughly 133 hours (5.5 days). It is very likely that the data will eventually need to be broken up into separate spreadsheets, grouping the voltage data, current data, and status data separately, but this has not happened yet. Instead, the script that sends the data now checks the number of rows in the spreadsheet first and deletes the top-most row (the oldest data) before adding new data. The cell limit therefore no longer stops data collection, but data more than 5.5 days old is thrown away. That may be fine for our purposes.
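
The capacity arithmetic and the delete-then-append behavior can be sketched as follows. This is a pure-Python model of what the script does through the spreadsheet API, not the API calls themselves:

```python
MAX_CELLS = 400000                # Google's per-spreadsheet limit at the time
COLUMNS = 150
MAX_ROWS = MAX_CELLS // COLUMNS   # 2666 rows, ~5.5 days at 3 min per row

def append_with_rotation(rows, new_row, max_rows=MAX_ROWS):
    """Drop the top-most (oldest) row when the sheet is full, then append."""
    if len(rows) >= max_rows:
        del rows[0]               # oldest data is discarded
    rows.append(new_row)
    return rows
```

In effect the spreadsheet becomes a ring buffer holding the most recent 5.5 days of readings.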

If interested, here is a gzipped tar file of the Python script (sendWienerData), a crontab file for causing the script to be run every 3 minutes (sendWienerData.cron), and a shell script which is required to interface cron with the Python script (sendWienerData.sh). (NOTE: I now spell 'Wiener' correctly.) After downloading and unpacking the archive, place all three files in your path on a Linux computer. You'll need to install the Python library for the Google Spreadsheet API; I used version 2.0.13 of the gdata package. I am using Python 2.4.3, so you may need to modify the script if you are using a different version of Python (Python is notorious for adding and removing features between versions). You may need some other Python packages as well, such as 'subprocess'.

To access a Google spreadsheet, you will need a username and password. These are base64-encoded and stored in an external file. Create the folder ~/sendWienerErrors and, within it, a file named sendWiener.ini containing the base64-encoded password on the first line and the base64-encoded username on the second line. You can create this file using the following commands:

$ base64 > ~/sendWienerErrors/sendWiener.ini
mystrongpw
<press Ctrl-D>
$ base64 >> ~/sendWienerErrors/sendWiener.ini
user@domain.com
<press Ctrl-D>

If you had the above password and username, your file would look like this:

$ cat sendWiener.ini
bXlzdHJvbmdwdwo=
dXNlckBkb21haW4uY29tCg==
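
For reference, this is roughly how a script can read the two lines back. The snippet recreates the example file first so it is self-contained, and the decode order (password, then username) matches the file layout above; it is written for a modern Python 3, whereas the original script targeted Python 2.4:

```python
import base64
import os
import tempfile

# Recreate the example sendWiener.ini from above in a temporary directory.
ini = os.path.join(tempfile.mkdtemp(), 'sendWiener.ini')
with open(ini, 'w') as f:
    f.write('bXlzdHJvbmdwdwo=\n')        # base64 of the password
    f.write('dXNlckBkb21haW4uY29tCg==\n')  # base64 of the username

# Decode in the same order the file stores them: password first.
with open(ini) as f:
    password = base64.b64decode(f.readline()).decode().strip()
    username = base64.b64decode(f.readline()).decode().strip()
```

Note that base64 is an encoding, not encryption; it only keeps the credentials out of casual view, so protect the file with filesystem permissions.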

If you have similar Wiener power supplies, you'll need to modify the mib variable in the Python script to point to your MIB file so that snmpget can properly access the power supply. If you are collecting some other kind of data, you'll want to remove all of the uses of snmpget and write your own Python code to acquire the data.
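
The snmpget side of the script boils down to building a command line like the one below and running it with subprocess. The host name, OID, community string, and MIB path here are all placeholders; substitute the ones for your supplies:

```python
import subprocess

def snmpget_cmd(host, oid, mib='/path/to/WIENER-CRATE-MIB.txt',
                community='public'):
    """Build the snmpget argument list.

    Every default here is a placeholder: point `mib` at your actual MIB
    file and use your supply's host name, OID, and community string.
    """
    return ['snmpget', '-v', '2c', '-m', mib, '-c', community, host, oid]

cmd = snmpget_cmd('ps1.example.fnal.gov', 'outputMeasurementSenseVoltage.u0')
# output = subprocess.check_output(cmd)  # run it once host and MIB are real
```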

After making the above adjustments to the Python script, keep executing sendWienerData and fixing the issues that Python complains about. Some errors in communicating with Google are written to a log file under ~/sendWienerErrors, so be sure to check there as well. You may also need to adjust the $PATH in sendWienerData.sh: when cron runs its 'payload', it does not set up the shell environment variables that are normally available at a command line, so these variables need to be set in sendWienerData.sh. Be sure to test executing from this shell script. Once everything works, the cron job can be started by executing:
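
Since cron starts jobs with almost no environment, sendWienerData.sh only needs to rebuild $PATH before handing off to the Python script. A minimal sketch, with example directories that you should adjust to wherever python, snmpget, and the script actually live:

```shell
#!/bin/sh
# sendWienerData.sh -- wrapper run by cron, which provides almost no
# environment, so PATH must be rebuilt here before the script launches.
# The directories below are examples; adjust them for your machine.
PATH=/usr/bin:/usr/local/bin:$HOME/bin
export PATH
# exec sendWienerData   # uncomment once sendWienerData is in one of the
                        # directories above
```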

$ crontab sendWienerData.cron

If you ever need to stop the cron job, use:

$ crontab -r

If you want to check that it is still enabled, use:

$ crontab -l

Any output to stderr gets emailed to your Linux user; use the 'mail' command to view it. With the logging that was added recently, this should no longer happen.

That's all there is to it! :-)

Need More Information?

For more information, please contact us through our contact page.

 
projects/nova/pdb_data.txt · Last modified: 2011/07/05 09:26 by sdg6h [at] Virginia [dot] EDU