Tunnel Snakes Rule! Bringing the many worlds of Python together to monitor Melbourne's biggest infrastructure project.
| Speaker | Evan Brumley |
|---|---|
| Time | 2019-08-03 11:10 |
| Conference | PyCon AU 2019 |
| Talk details | Link |
7 sites.
The performance requirements table alone is 59 pages long.
Put a sensor in a given location, calculate what a fish would hear, etc.
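The talk doesn't show the acoustics maths, but as a rough illustration of "what a fish would hear": underwater noise is usually reported as a sound pressure level (SPL) in dB re 1 µPa, computed from the RMS of the pressure samples. A minimal sketch (the function name and sampling details are mine, not from the talk):

```python
import math

def spl_db(samples_upa, p_ref_upa=1.0):
    """Sound pressure level in dB re 1 uPa (the underwater reference).

    samples_upa: iterable of pressure samples in micropascals.
    """
    samples = list(samples_upa)
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(rms / p_ref_upa)
```

For a pure tone of amplitude 1e6 µPa, RMS is amplitude/√2 and the SPL comes out at roughly 117 dB re 1 µPa.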
Classical approach: collect results monthly, generate monthly reports.
Modern approach: an existing cloud vendor platform, which wasn't flexible enough for the requirements.
Requirements:
- Accept data from any device. Mostly via vendor platform.
- Validate and store telemetry. Losing data is a big deal, we get audited.
- Analyse and process telemetry. Calculations are complicated and likely to change, with changes applied retrospectively.
- Provide access to data. To environmental teams and external stakeholders.
- Send alerts.
- Reporting workflows.
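As a sketch of the "validate, never lose data" requirement above — the field names and rules here are assumptions, not the project's actual schema. The key point is that invalid readings are flagged for quarantine rather than dropped, so the audited record stays intact:

```python
from datetime import datetime

REQUIRED_FIELDS = {"device_id", "timestamp", "value"}

def validate_reading(reading):
    """Return (ok, reason). Invalid readings should be quarantined
    for audit, never silently discarded."""
    missing = REQUIRED_FIELDS - reading.keys()
    if missing:
        return False, f"missing fields: {sorted(missing)}"
    try:
        datetime.fromisoformat(reading["timestamp"])
    except (TypeError, ValueError):
        return False, "unparseable timestamp"
    if not isinstance(reading["value"], (int, float)):
        return False, "non-numeric value"
    return True, ""
```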
Time frame: 4 months to first release, then 6 more months to feature complete.
Team: WSP + Arup. Fully self-contained.
- Collect Device Readings. API Pollers. Deployed using EBS. Should be asyncio.
- Buffer Device Readings. AWS Kinesis.
- Log everything for audits.
- Validate and store.
- Analyse, make available, raise alerts. Read only access to database.
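A minimal sketch of what an asyncio poller for the first step could look like. `fetch` and `publish` stand in for the vendor API call and the Kinesis put (e.g. boto3 `put_record` run in an executor); every name here is my assumption, not the project's code. The win over threads is one cheap concurrent task per device:

```python
import asyncio

async def poll_device(device_id, fetch, publish, interval_s=60, max_polls=None):
    """Poll one device endpoint repeatedly, pushing each reading
    onto the buffer (Kinesis in the real system).

    max_polls is only for testing; production would run forever.
    """
    polls = 0
    while max_polls is None or polls < max_polls:
        reading = await fetch(device_id)   # vendor API call (assumed)
        await publish(reading)             # push to stream (assumed)
        polls += 1
        if max_polls is None or polls < max_polls:
            await asyncio.sleep(interval_s)

async def poll_all(device_ids, fetch, publish, **kw):
    # One concurrent task per device.
    await asyncio.gather(
        *(poll_device(d, fetch, publish, **kw) for d in device_ids)
    )
```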
AWS has fully managed Apache Flink service.
Step 5 is disconnected from telemetry collection, as we don't entirely trust our own code and don't want to risk damaging data that we get audited on.
Pandas.
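One plausible shape for the Pandas processing step, given that calculations change and get re-run retrospectively over the stored raw data. This assumes dB noise readings in a DataFrame with a DatetimeIndex; the energy-averaging (Leq-style) is my assumption, since logarithmic dB values can't be averaged arithmetically:

```python
import numpy as np
import pandas as pd

def hourly_leq(df):
    """Aggregate raw dB readings to hourly energy-averaged levels.

    Expects a DataFrame with a DatetimeIndex and a 'value' column
    in dB. Re-runnable over historical data when the calculation
    changes.
    """
    linear = 10 ** (df["value"] / 10)      # dB -> linear power
    hourly = linear.resample("1h").mean()  # average in linear domain
    return 10 * np.log10(hourly)           # back to dB
```

Because the raw telemetry is stored untouched (step 5 above), a changed calculation is just a re-run of a function like this over the full history.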