
Latest revision as of 21:25, July 18, 2022

Landing Page for the ATLAS In-Flight Beam Wiki

The goal of this wiki is to provide access to information on detectors (locations, status, etc.), electronics, hardware, software, targets, and miscellaneous items pertaining to in-flight beam production. The actual data collection and analysis information can be found in the ELOG.

Tools for In-Flight Beam Tuning

Past Beam Delivery List

ExpList

Proposed directory / data file structure

New experiments will be placed in an ~/experiments/inflXX_zAA folder (e.g. infl12_n16) on diag1. Inside, the folder will have compass, BoxScore, screenshots, and data directories. BoxScore will be a git repo with the current branch named after the experiment, e.g. infl12_n16. The data directory will hold links to the data inside both compass and BoxScore. Data files will be labeled (mostly by BoxScore) inflXX_zAA_MonDay_###.root, e.g. infl12_n16_Oct25_0.root.

~/experiments/inflXX_zAA
                        /compass --> for all things compass
                        /BoxScore --> (github branch)
                        /screenshots --> for any figures
                        /data --> place for data instead of inside BoxScore / compass....

All data and compass files will be backed up on diag3. diag3 will also have an experiments folder, but most likely only the screenshots directory will be populated.
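The layout above can be sketched as a short shell snippet (the experiment name infl12_n16 is just the example from the text; run on diag1):

```shell
# Create the proposed skeleton for one experiment (name is the example from the text).
exp=infl12_n16
base="$HOME/experiments/$exp"
mkdir -p "$base/compass" "$base/BoxScore" "$base/screenshots" "$base/data"
# BoxScore itself would be a git repo with the branch named after the experiment, e.g.:
#   git clone https://github.com/goluckyryan/RealTimeReading "$base/BoxScore"
#   git -C "$base/BoxScore" checkout -b "$exp"
ls "$base"
```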

Hardware Information

Computers

- raisordaq - main daq computer located in SPS [ubuntu18]
- diag1 - able to run desktop digitizers
- diag3 - interface computer typically located in control / data room
- diag2 - F-wing lab for testing
- raisortab (??) - Latitude tablet for running HV and emulator

Digitizers

- desktop digitizer 1 [get S/N] 8-ch 500 MHz
- VX1730S - 16-ch VME w/ 500 MHz 14-bit 2.0 Vpp, currently working with raisordaq and VME 8008X
- x2 V1742 - 16-ch VME w/ 3.2 GHz 12-bit 2.5 Vpp

High Voltage

- VX3718 VME bridge for HV control in VME 8008X crate [raisordaq]
- V6519P VME 6-ch HV w/ +500 V 3 mA [raisordaq]
- V6521M VME 6-ch HV w/ ±6 kV 300 uA [raisordaq]
- Desktop HV 4-ch in F-Wing lab; works with raisortab only

Preamps

- x2 RAISOR 8-ch low-gain mesytec in SPS ??
- x1 4-ch low-gain mesytec in SPS
- 32-ch mesytec low-gain (for S1 ??) in SPS

Misc

- x4 fiber optic cables

BoxScore

BoxScore is a custom-made C++ program for "almost" real-time monitoring.

github : https://github.com/goluckyryan/RealTimeReading

Required libraries:

CAENComm.h

CAENVMElib.h

CAENDigitizer.h

cern root

Program arguments:

./BoxScore boardID Location <save_file_name>
                     +-- testing 
                     +-- exit 
                     +-- cross 
                     +-- ZD (zero-degree) 
                     +-- XY (Helios target XY) 
                     +-- iso (isomer with Glover Ge detector) 
                     +-- IonCh (IonChamber)

The boardID can be checked by running DetectDigitizer

program running flow (SOME OF THIS IS OUT OF DATE 11/19)

The source code is src/BoxScore.c

  1. When started, it reads the current date and time from the system and formats the default save_file_name.
  2. Based on the Location, it makes a ChannelMask and sets the dE and E channels.
  3. Reads generalSetting.txt.
  4. Reads setting_X.txt for the channel-X settings.
  5. Opens the digitizer and configures it from the settings.
  6. Makes a root file for saving the data and histograms; makes a canvas for display.
  7. Readout loop:
    1. Check whether a keyboard key was hit; get the key.
    2. If acquisition is started:
      1. retrieve data from the digitizer
      2. every 1 sec (the period can be set in generalSetting.txt), sort events by timestamp and build events
    3. If acquisition is stopped.
    4. If a cut creator is needed, the acquisition is stopped and the CutCreator program is loaded.
    5. If clear histogram.
    6. If quit.
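Step 1's default save_file_name presumably follows the naming convention from the directory-structure section; a shell sketch of that formatting (the prefix and run number are illustrative, not read from BoxScore):

```shell
# Sketch of how the default file name could be formed (prefix and run number illustrative).
exp=infl12_n16          # experiment prefix
run=0                   # run number appended per file
stamp=$(date +%b%d)     # e.g. Oct25, as in the naming convention
fname="${exp}_${stamp}_${run}.root"
echo "$fname"
```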

Algorithm of event building

Grafana + InfluxDB

Current setting on July 2022

When BoxScore is running at raisordaq, it will push the totalRate and (if any) cuts rates to the InfluxDB service at raisordaq using

void WriteToDataBase(TString databaseName, TString seriesName, TString tag, float value){
   TString databaseStr;
   // Shell out to the influx CLI to insert one point:
   // measurement = seriesName, tag string = tag, field "value" = value
   databaseStr.Form("influx -execute \'insert %s,%s value=%f\' -database=%s", seriesName.Data(), tag.Data(), value, databaseName.Data());
   system(databaseStr.Data());
}

The actual InfluxDB database name is db.

The databaseName (the data source) in Grafana is RAISOR_db.

The seriesName = totalRate / cut1 / cut2 .... etc

The tag = exit / cross / ZD
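Concretely, with seriesName = totalRate, tag = exit, value = 12.3, and databaseName = db, the function shells out a command like the one below (built and echoed here rather than executed, since it needs a running InfluxDB; the C++ %f would render the value as 12.300000):

```shell
# Build the same command string WriteToDataBase would pass to system().
seriesName=totalRate; tag=exit; value=12.3; databaseName=db
cmd="influx -execute 'insert ${seriesName},${tag} value=${value}' -database=${databaseName}"
echo "$cmd"
```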

InfluxDB

As of July 2022:

The InfluxDB server is hosted on raisordaq. The access port is 8086.

  sudo service influxdb start
  sudo service influxdb status

http://raisordaq.onenet:8086

The database name in InfluxDB is just db.
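Once the service is up, the stored series can be inspected with the influx CLI using standard InfluxQL (printed here rather than executed, since the queries need the running server; totalRate is the series name from the BoxScore section):

```shell
# InfluxQL queries one would run against the server above (printed, not executed).
cat <<'EOF'
influx -database db -execute 'SHOW MEASUREMENTS'
influx -database db -execute 'SELECT * FROM totalRate ORDER BY time DESC LIMIT 5'
EOF
```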

--- OLD ---

The InfluxDB server was hosted at diag1.

The InfluxDB server can be accessed via port 8086.

If the InfluxDB server is not started, it can be started by

  sudo service influxdb start

Grafana

The Grafana server is hosted at diag3.

The Grafana webpage can be accessed using any browser:

http://diag3.onenet:3000 

The login user name is admin, password is the same as diag3 login.

If the server is not started (for example, the webpage does not load but the connection to diag3 is OK), it can be started by

sudo service grafana-server start

--

Working to get Grafana running on the new iMac.

To start Grafana as lcladmin:

  brew services restart grafana

To access the Grafana dashboard, use port 3000 (and hopefully):

http://raisormac.onenet:3000

else try

http://localhost:3000

Both have username: admin, password: longbaseballone.

Notes on install: started the install with the local admin account and did NOT get it running under the raisor username.

Under lcladmin,

  brew install grafana

was successful. The grafana.ini file is located at

/opt/homebrew/etc/grafana

Modified the file to change [server] domain=raisormac.onenet.

Should copy some settings from the diag3 ini file ??
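For reference, the change described above corresponds to this fragment of grafana.ini (section and key names follow Grafana's standard configuration format):

```ini
[server]
domain = raisormac.onenet
```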

Beam Current Integrator

Keithley

/home/phy/raisor/Keithley is the local initial location of the keithley6514.py file. It has also been added to the OptSB repo. To run, connect to the USB port on the front, likely at /dev/ttyUSB0, then

  python keithley6514.py


To Do List

ToDo