.. _heros in atomiq:

HEROS in ATOMIQ
###############

Atomiq is perfectly suited to describe and control the realtime hardware in the setup. However, a good deal of the hardware orchestration concerns non-realtime devices that are not directly connected to the realtime control system but have to be controlled through a vast variety of different interfaces. To make all these devices available in atomiq experiments, the `HEROS`_ framework is used. The idea behind HEROS is that every hardware object becomes a software object that can be transparently accessed via the network. To seamlessly interact with non-realtime hardware and to make atomiq a part of the ecosystem, HEROS is fully integrated into atomiq, and atomiq itself is integrated into the HEROS network.

.. note::

   HEROS is a peer-to-peer network and does not require a central server. Instead, it uses UDP broadcasts to discover other participants in the network and thus works automagically within an unrestricted network segment. Note, however, that services running in a docker container cannot send or receive UDP broadcasts from or to the host network. In such cases the container (HERO and/or atomiq) can be configured to use the host network or, alternatively, a zenoh router can be used to broker the discovery (only the discovery, not the communication!).

Using HEROs in the Atomiq experiment
====================================

Using an available HERO in your atomiq experiment is as easy as putting its name into your components list. Let's assume there is a dummy voltage source available as a HERO with the name ``rfsource_dummy`` in the default HEROS realm ``heros``. By prefixing the HERO name with the magic identifier ``$`` you can state in your components list that the experiment requires this HERO, as shown below:

.. code-block:: python

    from artiq.experiment import kernel
    from atomiq import AtomiqExperiment

    class HerosTest(AtomiqExperiment):
        components = ["herosink", "$rfsource_dummy"]

        @kernel
        def step(self, point):
            self.rfsource_dummy.set_amplitude(0.56)

.. note::

   If your HERO is not in the default realm, you can use the syntax ``$realm/hero_name`` to reference it. If you do so in the components list, the HERO becomes available as ``self.realm_hero_name``.

Any HERO imported this way inherits from ``Component`` inside ATOMIQ. However, if the HERO specifies in its metadata that it is compatible with a more specific class in ``atomiq.components``, it is based off this more specific class and atomiq assumes that the necessary interface exists in the HERO. This way, non-realtime devices like RF sources, voltage and current sources, ADCs, simple switches, etc. can look exactly like their realtime counterparts from a software perspective.

Inside atomiq, you can use all methods, attributes and events that the HERO exposes. If your HERO carries proper type annotations, you can also access the HERO from within kernel code sections (of course, this issues an RPC call to the artiq master, which then calls the HERO). For more details on how to make your device a HERO and how the definition of the atomiq interfaces in HEROS works, check the documentation of `HEROS`_ and `herosdevices`_.

If your HERO implements an ATOMIQ component interface, it can be used like any other component of that type. This also allows referencing it in the component definition in the very same way. For example, an AOM driven by an RF source that is controlled by a HERO ``rfsource_dummy`` implementing ``atomiq.components.electronics.rfsource.RFSource`` could look like the following in your components definition:

..
code-block:: python

    {
        "aom_cooler": {
            "classname": "atomiq.components.optoelectronics.lightmodulator.AOM",
            "arguments": {
                "rfsource": "$rfsource_dummy",
                "switch": "$rfsource_dummy",
                "center_freq": 80e6,
                "bandwidth": 40e6,
                "switching_delay": 30e-9,
                "order": 1,
                "passes": 2
            }
        }
    }

ATOMIQ becoming a HERO
======================

ATOMIQ comes with its own frontend to start the artiq master in ``atomiq.frontend.atomiq_master``. Using this starter modifies artiq such that the artiq scheduler itself becomes a HERO. This provides a HERO for managing the execution of experiments and, in turn, lets external logic act on your experiment-analysis-experiment cycle. Re-scheduling an experiment with different parameters, or scheduling a completely different experiment depending on the outcome of a previous experiment or the computation result of some other HERO (think AI), becomes easily possible with this feature.

Scheduling Experiments
======================

In addition to the scheduler itself, the experiment database is available as a HERO upon starting ``atomiq.frontend.atomiq_master``. Using this database, scheduling an experiment becomes as easy as:

.. code-block:: python

    from heros import RemoteHERO

    scheduler = RemoteHERO("atomiq-scheduler")
    db = RemoteHERO("atomiq-experimentdb")

    # list all experiments (the ones you would see in the dashboard)
    db.list_experiments()

    # generate a new run definition (internally denoted as expid in ARTIQ)
    expid = db.get_experiment("examples/ATQExperiment")

    # modify an argument
    expid["dummy_voltage"]["repetitions"] = 10

    # schedule the experiment to the "main" pipeline
    rid = scheduler.submit("main", expid)

    print(f"Experiment with class name {expid['class_name']} scheduled as RID {rid}")

Duplicating Running Experiments
===============================

The run configuration (the ``expid``) of a running experiment can be extracted via the scheduler, saved, and run again.
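Re-running a saved configuration later is then a plain ``scheduler.submit`` call on the loaded ``expid``. The following is a minimal sketch, assuming a run definition was previously saved to a JSON file (``rid_666.json`` here, matching the saving example) and that an ``atomiq-scheduler`` HERO is reachable; the helpers ``load_expid`` and ``resubmit`` are illustrative, not part of the atomiq API:

```python
import json


def load_expid(path):
    """Load a previously saved run definition (expid) from a JSON file."""
    with open(path) as fp:
        return json.load(fp)


def resubmit(path, pipeline="main"):
    """Submit a saved expid to the given scheduler pipeline and return the new RID."""
    # imported here so the pure helper above also works without a HEROS network
    from heros import RemoteHERO

    scheduler = RemoteHERO("atomiq-scheduler")
    return scheduler.submit(pipeline, load_expid(path))


if __name__ == "__main__":
    print(f"resubmitted as RID {resubmit('rid_666.json')}")
```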
Note that this method does not save the state of the script itself; it merely saves the parameter configuration from the dashboard.

.. code-block:: python

    import json

    from heros import RemoteHERO

    scheduler = RemoteHERO("atomiq-scheduler")

    # get the expid from the running experiment with RID 666
    rid = 666
    expid = scheduler.get_expid(rid)

    # save it to a json file
    with open("rid_666.json", "w") as fp:
        json.dump(expid, fp, indent=4)

Using Experiment HEROs
======================

All scheduled and running experiments are HEROs as well. This allows you to interact with the living experiment from remote: you can read out important internal variables (``step_counter``, ``rid``, etc.), request termination, or add your own custom functions. These experiment HEROs also provide an event ``emit_data`` that can be used to broadcast data out of your experiment, as described in the following section.

The scheduler HERO triggers an event ``run_started`` when a new experiment enters the run phase and an event ``run_ended`` when the run phase has finished. The metadata transmitted with these events contains all the information needed to get the HERO of the experiment that just started.

The class-level attribute ``atomiq.AtomiqExperiment.TAGS`` can be used to tag experiments. During creation of the ``AtomiqHERO``, these tags populate ``heros.LocalHERO._hero_tags`` and can, for example, be accessed to trigger downstream actions:

.. code-block:: python

    class SampleSpectroscopy(AtomiqExperiment):
        TAGS = ["calibration", "analyze_spec"]

        # the rest of the experiment


    # Bob's machine where the analysis should run
    from heros import RemoteHERO

    scheduler = RemoteHERO("atomiq-scheduler")

    def handle_run_started(data):
        atomiq_run_hero = RemoteHERO(data["hero_name"])
        if "analyze_spec" in atomiq_run_hero._hero_tags:
            is_calibration = "calibration" in atomiq_run_hero._hero_tags
            # pass the run to the analysis worker
            do_analysis(atomiq_run_hero, publish=is_calibration)

    # register the handler with the scheduler's run_started event
    scheduler.run_started.connect(handle_run_started)

Using the Experiment HERO as Data Sink
======================================

With the following in your atomiq components dict:

.. code-block:: python

    "herosink": {
        "classname": "atomiq.components.basics.datasink.HEROSink"
    }

broadcasting your data becomes as easy as:

.. code-block:: python

    from artiq.experiment import kernel
    from atomiq import AtomiqExperiment

    class HerosTest(AtomiqExperiment):
        components = ["herosink"]

        @kernel
        def step(self, point):
            # emit env data (like run ids, arguments for the current step, etc.)
            self.herosink.submit_env(point)

            # emit custom data
            self.herosink.submit_data(["my_variable", "your_variable"], [30.21, -13.0])

To access the data put into the herosink, you can get a reference to the experiment HERO and connect your analysis or processing method to the event ``emit_data``. This could look as follows:

.. code-block:: python

    from heros import RemoteHERO

    def process_and_print(data):
        # processing...
        print(data)

    experiment = RemoteHERO("atomiq-run-rid")
    experiment.emit_data.connect(process_and_print)

The ``RemoteHERO`` of the currently running experiment can additionally be retrieved from the scheduler:

.. code-block:: python

    from heros import RemoteHERO

    def process_and_print(data):
        # processing...
        print(data)

    scheduler = RemoteHERO("atomiq-scheduler")
    experiment = scheduler.get_current_run()
    if experiment is not None:
        experiment.emit_data.connect(process_and_print)
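To process broadcast data from every run without knowing its RID up front, the scheduler's ``run_started`` event can be combined with the experiment HERO's ``emit_data`` event. This is a sketch under two assumptions: that scheduler events are connected with the same ``.connect()`` call used for ``emit_data``, and that the event metadata carries the experiment's HERO name under ``hero_name`` as in the tag example; the factory function ``make_run_listener`` is illustrative, not part of the atomiq API:

```python
def make_run_listener(process):
    """Build a run_started handler that attaches `process` to each new run's emit_data."""
    def handle_run_started(data):
        # imported here so the factory itself has no network dependency
        from heros import RemoteHERO

        # the event metadata names the HERO of the freshly started experiment
        run = RemoteHERO(data["hero_name"])
        run.emit_data.connect(process)
    return handle_run_started


def process_and_print(data):
    # processing...
    print(data)


if __name__ == "__main__":
    from heros import RemoteHERO

    scheduler = RemoteHERO("atomiq-scheduler")
    scheduler.run_started.connect(make_run_listener(process_and_print))
```

With this in place, the handler follows the scheduler for as long as the script runs, so data from every subsequent run is forwarded to ``process_and_print`` automatically.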