(Square Kilometre Array Organisation), Xavier Espinal
EOSC-Future amendments to the original proposal have been submitted.
WP2/WP5 joint workshop: two half-day sessions to take place between 29 March and 9 April.
Doodle poll to be circulated next week by Yan.
The round of updates from the participating institutes is (almost) finished (IN2P3 next week).
Thanks for your contributions so far. We have good material to discuss among the Task leads in order to define a programme of work that fulfils both the partner institutes' interests and the project roadmap.
Several common fields of interest between the sciences and the partner institutes have been identified; these will drive the second phase of the Data Lake prototype.
A mix of scientific and technological goals is in the pipeline and looks promising for a challenging full-scale test of the prototype a year from now.
Possibility to leverage the Data Lake infrastructure for end-user-oriented and/or educational purposes through the Analysis Platforms and Notebooks infrastructure.
In particular, there is a common interest in collaborating with CS3MESH4EOSC, as we share partners and experiments (e.g. LOFAR); this could lead to a promising proof of concept covering the full chain from students working on notebooks to data ingestion and distribution.
The scope for the next months is to achieve a significant understanding of how the ESCAPE Data Lake can integrate external and heterogeneous resources: commercial clouds, sites' ephemeral resources, and HPC centres. This is milestone M30 of the project.
(I) Need to reactivate the synergies with ARCHIVER in order to start the ISO 16363 certification process for CERN and some of the partners in ESCAPE (DESY and PIC).
Milestones and deliverables review
M2.5 (M30): Extension of the data lake to efficiently serve data to external compute resources providers
Means of verification: Progress report; Monitoring web site
In progress, LAPP is taking the lead (Frederic Gillardo)
M2.6 (M32): ISO 16363 certification process underway in core data centres
Means of verification: Progress report; core data centres have finished the self-certification audit and are ready to submit to an external audit.
Updates from participating institutes
I would like to have updates from the participating institutes: status, plans (prototype phase), particular fields of interest, experiment links, etc. We could dedicate the next two meetings to this activity and align it with the different task plans. I propose: