How to automate measurements with Python: Page 3 of 4

April 19, 2016 | By Fabrizio Guerrieri
Fabrizio Guerrieri, Sr. System/Application Engineer at Maxim Integrated, considers ways of automating measurements with Python.

Talking to instruments, saving data, and plotting
Look at the second part of the code:

[Download all code from this article as a text file.]

for load in loads:                                       # 8
    chroma.write('CURR:STAT:L1 %.2f' % load)             # 9
    chroma.write('LOAD ON')                              # 10
    time.sleep(1)                                        # 11

    temp = {}                                            # 12
    daq.write('MEAS:VOLT:DC? AUTO,DEF,(@101)')           # 13
    temp['Vout'] = float(daq.read())                     # 14
    daq.write('MEAS:VOLT:DC? AUTO,DEF,(@102)')           # 15
    temp['Iout'] = float(daq.read())/0.004               # 16

    results = results.append(temp, ignore_index=True)    # 17
    print("%.2fA\t%.3fV" % (temp['Iout'], temp['Vout'])) # 18

chroma.write('LOAD OFF')                                 # 19
results.to_csv('Results.csv')                            # 20

Lines 9 and 10 configure the desired load current and turn the load on. Communicating with instruments over GPIB is all about using read/write methods and knowing the command strings the instrument accepts, as indicated in the instrument manual. As in other programming languages, %.2f is a placeholder that is replaced at run-time with the content of the variable load; it also indicates that we want the value formatted as a real number with two decimal digits. Line 11 generates a one-second delay, useful to make sure the instruments and the circuit have reached a steady-state condition.
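
As a standalone illustration of the placeholder mechanism (the command string itself comes from the load's manual, and the value here is made up), this is how %.2f expands before the string reaches the instrument:

```python
# The %.2f placeholder is filled in at run-time with the loop variable
# and formats it with two decimal digits, exactly as in line 9 above.
load = 1.5
cmd = 'CURR:STAT:L1 %.2f' % load
print(cmd)  # -> CURR:STAT:L1 1.50
```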

Line 12 creates an empty dictionary, the Python object we use to temporarily store the results of one iteration of the loop.

Lines 13 to 16 measure the output voltage and current. The first command tells the instrument what we want to do (measure DC voltage with automatic scale) and the desired acquisition channel. The output voltage and current are acquired on channels 101 and 102, respectively. The second command reads the result back and stores it into temp. The data is returned as a string, so it must be converted to a real number using the float function. Also, because the DAQ is measuring voltages, we need to divide the reading by the shunt resistance (0.004 Ω) to get the correct value for the current.
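
As a standalone sketch with a made-up reading (the reply string is hypothetical), the string-to-number conversion and the shunt correction from lines 14-16 work like this:

```python
reply = '0.0061'           # hypothetical string as returned by daq.read()
vshunt = float(reply)      # convert the instrument's reply to a real number
iout = vshunt / 0.004      # divide by the 0.004-ohm shunt resistance
print('%.3f A' % iout)     # -> 1.525 A
```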

Note how easy it is to save data in an organized fashion using Python and Pandas: the fields of the temp dictionary don't need to be defined in advance and are accessed using meaningful strings. There's no need to remember the relationship between column number and data as we would have to do if we had instead decided to use an array to store the data.

In line 17 we append the dictionary to the results dataframe. Note that results doesn't need its columns defined in advance either; every time a new row is appended, any new field is added to the dataframe as a new column.
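
A side note, hedged for readers on current pandas: DataFrame.append was removed in pandas 2.0, and the same open-ended row-by-row pattern is now written with pd.concat. A minimal sketch with made-up rows shows how a brand-new field simply appears as a new column:

```python
import pandas as pd

row1 = {'Vout': 0.998, 'Iout': 1.5}
row2 = {'Vout': 0.995, 'Iout': 3.0, 'Pass': 'Yes'}   # a brand-new field

results = pd.DataFrame([row1])
# pd.concat replaces the older DataFrame.append; ignore_index renumbers rows
results = pd.concat([results, pd.DataFrame([row2])], ignore_index=True)

# The 'Pass' column was created on the fly; row 1 gets NaN there.
print(sorted(results.columns))  # -> ['Iout', 'Pass', 'Vout']
```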

Line 18 is optional, but it can be useful to print the present voltage and current on the terminal, in particular for long measurements, as a way to make sure the application is still running and to know how far along it is.

In lines 19 to 20 the load is turned off and the data is saved on disk. For the latter, each dataframe object has a handy embedded method to save the data on a CSV file.

The power of dataframes
To explore the power of using Python and Pandas dataframes, just add this code between line 16 and line 17.

    temp['Vout_id'] = 1.0 - 2.5e-3*temp['Iout']            # A
    temp['Vout_err'] = temp['Vout_id'] - temp['Vout']      # B
    temp['Pass'] = 'Yes'                                   # C
    if (abs(temp['Vout_err']) > temp['Vout_id']*0.001):    # D
        temp['Pass'] = 'No'                                # E

Lines A and B generate two new fields of the dataframe. Vout_id contains the ideal DC setpoint of the output voltage, given the measured current, the ideal zero-current setpoint (1 V), and the loadline. Vout_err is the error between the ideal and the measured voltage.

Lines C through E add the Pass field to the dataframe. The content of the field is a string indicating whether a hypothetical specification of ±0.1% on the output-voltage accuracy is met. In Figure 3, you can see how the saved CSV file looks in Excel. It's marvelous: numeric data and text are in the same table, and even the column headers are automatically generated from the names of the dataframe fields.

Figure 3. The Python script can save data in CSV format, which easily opens in Excel.

Data analysis and plotting with Pyplot
The snippet of code described in the previous section allowed us to determine whether the output voltage was within the "tolerance band" around its ideal value. Another interesting piece of information we might want to get out of this experiment is the exact value of the loadline, which is the slope of the VOUT-vs-IOUT curve. If you don't remember how to do a linear fit of the acquired data, don't worry: Python has a function for that, too. Just insert this code at the end of the script:

from scipy.stats import linregress                         # A
loadline = linregress(results['Iout'], results['Vout'])    # B
print("The loadline is %.2f mohm" % (loadline[0]*1000))    # C
print("The intercept point is %.3f V" % loadline[1])       # D

Line A imports a single method from Scipy's Stats module. In line B, we feed the imported linregress method with the X and Y coordinates of the points we are trying to fit. Finally, we print the results on the terminal in lines C and D. Linregress returns several results organized in an array-like object, with the slope saved at index 0 and the intercept at index 1. Other available results include the correlation coefficient and the standard error of the estimate.
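
A self-contained sketch with synthetic points (made up for illustration) placed exactly on a 2.5 mΩ loadline with a 1 V intercept confirms what linregress returns:

```python
from scipy.stats import linregress

iout = [0.0, 5.0, 10.0, 15.0, 20.0]        # synthetic load currents (A)
vout = [1.0 - 2.5e-3 * i for i in iout]    # exact 2.5-mohm loadline, 1 V offset

fit = linregress(iout, vout)
# The slope is negative because Vout droops as the current increases.
print('loadline  = %.2f mohm' % (fit[0] * 1000))  # -> -2.50 mohm
print('intercept = %.3f V' % fit[1])              # -> 1.000 V
```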

With such a small dataset (20 points), Excel could generate the plots just as well, but a three-line example shows how that can be done in Python. Just add these lines at the end of the script described previously (the 'ro' parameter of the plot method indicates that I wanted to use red circle markers):

import matplotlib.pyplot as plt                     # A
plt.plot(results['Iout'], results['Vout'], 'ro')    # B
plt.show()                                          # C

Pyplot is a module of Python's Matplotlib library that contains plenty of methods for plotting graphs. Better still, the methods have been designed to be almost identical to MATLAB's. You can see the results of these three lines of code in Figure 4. The window and the graphics are automatically generated by Pyplot, and they appear "out of thin air" from the terminal window.

Figure 4. Pyplot lets you plot data, which eliminates having to open the data in Excel.

Python is an excellent choice to automate your laboratory setup and avoid tedious hours of measurements because it is simple to use, easy to understand, and extremely flexible and powerful. LabVIEW is, however, still the king of the GUI. In general, I think LabVIEW is better suited for applications requiring a nice graphical interface and that aren't required to execute complex loops or data processing. For instance, I still use LabVIEW to design most of the applications that are customer-facing and so need to be pretty but are rarely complicated. For all the other applications and automation needs, though, Python is now my first choice.

