Revision as of 16:14, June 23, 2022

Landing Page for the ATLAS In-Flight Beam Wiki

The goal of this wiki is to collect detector (locations, status, etc.), electronics, hardware, software, target, and miscellaneous information pertaining to in-flight beam production. The actual data-collection and analysis information can be found on the ELOG.

Tools for In-Flight Beam Tuning

Experiment List

ExpList

  • infl1 - 19O commissioning [Jul / Aug 18]
  • infl2 - 16C development [Aug / Oct 18]
  • infl3 - 30P development [Oct 18]
  • infl4 - 16C delivered to MUSIC / SPS [Dec 2018, Feb 2019]
  • infl5 - 30P delivered to Gretina/FMA/GODDESS [Feb/March 2019]
  • infl6 - 12B delivery to HELIOS [Apr/May 2019]
  • infl7 - 8Li delivery to HELIOS [May/June 2019]
  • infl8 - 29Al,31Si development [June 2019]
  • infl9 - 31Si to HELIOS ATLAS 1830 Wilson [June 2019]
  • infl10 - 22Mg development for SPS/MUSIC [July 2019]
  • infl11 - 29Al delivery to HELIOS [July 2019]
  • infl12 - 16m,gN delivery to HELIOS [October 2019]
  • infl13 - 44Ti development to SPS/MUSIC [November 2019]
  • infl14 - 14O delivery to SPS/MUSIC [December 2019]
  • infl15 - 14O development to SPS/MUSIC [March 2020]
  • infl16 - 16N test / iso measure HELIOS [October 2020]
  • infl17 - 29Al delivery to HELIOS [November 2020]
  • infl18 - 22Mg delivery to SPS/MUSIC [December 2020]
  • infl19 - Development w/ 20Ne beam to SPS [February 2021]
  • infl20 - Development w/ 40Ar beam to SPS [April 2021]
  • infl21 - 15C delivery to HELIOS [March 2021]
new exp template

Proposed directory / data file structure

New experiments will be placed in an ~/experiments/inflXX_zAA (e.g. infl12_n16) folder on diag1. Inside the folder there will be compass, BoxScore, screenshots, and data directories. BoxScore will be a git repo with the current branch set to the experiment name, e.g. infl12_n16. The data directory will have links to the data inside both compass and BoxScore. Data files will be labeled (mostly by BoxScore) inflXX_zAA_MonDay_###.root, e.g. infl12_n16_Oct25_0.root.

~/experiments/inflXX_zAA
                        /compass --> for all things compass
                        /BoxScore --> (github branch)
                        /screenshots --> for any figures
                        /data --> place for data instead of inside BoxScore / compass

All data and compass files will be backed up on diag3. On diag3 there will also be an experiments folder, though most likely only the screenshots directory will be populated.
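
The data-file label described above can be built from the system date. A minimal sketch (not the actual BoxScore code; the function name and arguments here are illustrative):

```cpp
// Sketch (illustrative, not the BoxScore source): build a data-file label
// of the form inflXX_zAA_MonDay_###.root from a given time.
#include <cstdio>
#include <ctime>
#include <string>

std::string DefaultFileName(const std::string& expName, int runNumber, std::time_t when) {
    char dateStr[16];
    std::tm* t = std::localtime(&when);
    std::strftime(dateStr, sizeof(dateStr), "%b%d", t);  // e.g. "Oct25"
    char buf[128];
    std::snprintf(buf, sizeof(buf), "%s_%s_%d.root",
                  expName.c_str(), dateStr, runNumber);
    return buf;
}
```

For example, DefaultFileName("infl12_n16", 0, now) on October 25 yields infl12_n16_Oct25_0.root, matching the naming scheme above.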

Hardware Information

Computers

- raisordaq - main DAQ computer located in SPS [Ubuntu 18]
- diag1 - able to run desktop digitizers
- diag3 - interface computer typically located in the control / data room
- diag2 - in the F-wing lab for testing
- raisortab (??) - Latitude tablet for running HV and the emulator

Digitizers

- desktop digitizer 1 [get S/N] - 8-ch 500 MHz
- VX1730S - 16-ch VME w/ 500 MHz, 14-bit, 2.0 Vpp; currently working with raisordaq and the VME 8008X crate
- x2 V1742 - 16-ch VME w/ 3.2 GHz, 12-bit, 2.5 Vpp

High Voltage

- VX3718 VME bridge for HV control in the VME 8008X crate [raisordaq]
- V6519P - VME 6-ch HV w/ +500 V, 3 mA [raisordaq]
- V6521M - VME 6-ch HV w/ ±6 kV, 300 uA [raisordaq]
- desktop HV, 4-ch, in the F-wing lab; works with raisortab only

Preamps

- x2 RAISOR 8-ch low-gain mesytec, 1 in SPS ??
- x1 4-ch low-gain mesytec in SPS
- x32-ch mesytec low-gain (for S1 ??) in SPS

Misc

- x4 fiber optic cables

BoxScore

BoxScore is a custom-made C++ program for "almost" real-time monitoring.

github : https://github.com/goluckyryan/RealTimeReading

program required libraries

CAENComm.h

CAENVMElib.h

CAENDigitizer.h

cern root

program arguments

/BoxScore boardID Location <save_file_name>
                     +-- testing 
                     +-- exit 
                     +-- cross 
                     +-- ZD (zero-degree) 
                     +-- XY (Helios target XY) 
                     +-- iso (isomer with Glover Ge detector) 
                     +-- IonCh (IonChamber)

The boardID can be checked by running DetectDigitizer

program running flow (SOME OF THIS IS OUT OF DATE 11/19)

The source code is src/BoxScore.c

  1. When started, it reads the current date and time from the system and formats the default save_file_name.
  2. Based on the Location, it makes a ChannelMask and sets the dE and E channels.
  3. Reads generalSetting.txt.
  4. Reads setting_X.txt for the channel-X settings.
  5. Opens the digitizer and configures it from the settings.
  6. Makes a ROOT file for saving the data and histograms; makes a Canvas for display.
  7. Readout loop:
    1. check whether a keyboard key has been hit, and get the keystroke
    2. if start acquisition:
      1. retrieve data from the digitizer
      2. every 1 sec (the time period can be set in generalSetting.txt), sort the events by timestamp and build events
    3. if stop acquisition
    4. if a cut creator is needed, the acquisition will be stopped and the CutCreator program loaded
    5. if clear histogram
    6. if quit

Algorithm of event building
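
The readout loop above sorts hits by timestamp and groups them into events. A minimal sketch of timestamp-window event building (illustrative only, not the BoxScore source; the Hit struct, function name, and coincidence window are assumptions):

```cpp
// Illustrative sketch of timestamp-window event building (not the BoxScore
// source). Hits are sorted by timestamp; hits within `window` clock ticks
// of the first hit of the current event are grouped into that event.
#include <algorithm>
#include <cstdint>
#include <vector>

struct Hit {
    int channel;
    uint64_t timestamp;  // digitizer timestamp in clock ticks
    double energy;
};

std::vector<std::vector<Hit>> BuildEvents(std::vector<Hit> hits, uint64_t window) {
    std::sort(hits.begin(), hits.end(),
              [](const Hit& a, const Hit& b) { return a.timestamp < b.timestamp; });
    std::vector<std::vector<Hit>> events;
    for (const Hit& h : hits) {
        // Start a new event when this hit falls outside the window
        // opened by the first hit of the current event.
        if (events.empty() || h.timestamp - events.back().front().timestamp > window)
            events.push_back({});
        events.back().push_back(h);
    }
    return events;
}
```

With hits at timestamps 100, 105, and 500 and a 50-tick window, the first two hits form one event and the third starts a new one.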

Grafana + InfluxDB

Current setting on June 14, 2019

When BoxScore is running at diag1, it will push the totalRate and (if any) cuts rates to the InfluxDB service at diag1 using

void WriteToDataBase(TString databaseName, TString seriesName, TString tag, float value){
   // Format an influx CLI call, e.g.
   //   influx -execute 'insert totalRate,exit value=12.500000' -database=RAISOR_exit
   TString databaseStr;
   databaseStr.Form("influx -execute \'insert %s,%s value=%f\' -database=%s", seriesName.Data(), tag.Data(), value, databaseName.Data());
   // Shell out to the influx client to push the point.
   system(databaseStr.Data());
}

The databaseName = RAISOR_exit

The seriesName = totalRate / cut1 / cut2 .... etc

The tag = exit / cross / ZD
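
For reference, the command string that WriteToDataBase assembles can be sketched in plain C++ without ROOT's TString (the function name here is illustrative; the string returned is what BoxScore passes to system()):

```cpp
// Sketch of the shell command assembled by WriteToDataBase, using
// std::string/snprintf instead of ROOT's TString::Form. Illustrative only.
#include <cstdio>
#include <string>

std::string MakeInfluxCommand(const std::string& databaseName,
                              const std::string& seriesName,
                              const std::string& tag, float value) {
    char buf[256];
    std::snprintf(buf, sizeof(buf),
                  "influx -execute 'insert %s,%s value=%f' -database=%s",
                  seriesName.c_str(), tag.c_str(), value, databaseName.c_str());
    return buf;
}
```

For example, MakeInfluxCommand("RAISOR_exit", "totalRate", "exit", 12.5f) yields the same command shown in the comment of WriteToDataBase above.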


InfluxDB

InfluxDB server is hosted at diag1.

The InfluxDB server can be accessed via port 8086.

If the InfluxDB server is not started, it can be started by

sudo service influxdb start

Grafana

The Grafana server is hosted at diag3.

The Grafana webpage can be accessed from any browser:

http://diag3.onenet:3000 

The login user name is admin, password is the same as diag3 login.

If the server is not started (for example, the webpage does not load but the connection to diag3 is OK), it can be started by

sudo service grafana-server start