The Australian System Safety Conference (ASSC) is organised by the Australian Safety Critical Systems Association (aSCSa), a Special Interest Group of the Australian Computer Society.
The theme for the 2024 conference is Disruptive Technologies. As systems grow increasingly complex and incorporate novel and disruptive technologies, we are exploring the challenges and benefits these technologies bring to system safety and cybersecurity. As systems increasingly incorporate novel technologies (such as artificial intelligence, machine learning, autonomous agents, cloud services, virtual/augmented reality, etc.), are current system safety processes and assurance standards still appropriate?
Topics of interest include (but are not limited to) systems engineering, safety assurance, human factors, cybersecurity and social engineering, security assurance, reliance, automation, collaboration, teamwork, trust, and assurance standards.
The 2024 ASSC would like to examine and share the latest thinking and state-of-the-art techniques for:
The conference invites original and unpublished works that advance the state of the art in safety and security considerations for the development and operation of safety- and security-critical systems. We are particularly interested in cross-industry collaboration and cross-pollination of ideas, and are looking for papers in:
Delegates have the option of submitting two types of papers:
Papers and presentations are published on the conference website.
The purpose of this award is to encourage research in the science of software/system engineering or the application of that science for safety and/or mission critical software-intensive systems.
Details to be announced.
Conference registration fees (except as noted) include attendance at all technical sessions, the evening social event, and the conference proceedings.
All fees as listed above are in Australian Dollars and GST inclusive. For questions about registration, contact the ASCSA Secretary at email@example.com.
Associate Professor, Carnegie Mellon University
Prof. Philip Koopman is an internationally recognized expert on Autonomous Vehicle (AV) safety whose work in that area spans over 25 years. He is also actively involved with AV policy and standards as well as more general embedded system design and software quality. His pioneering research work includes software robustness testing and run-time monitoring of autonomous systems to identify how they break and how to fix them. He has extensive experience in software safety and software quality across numerous transportation, industrial, and defense application domains including conventional automotive software and hardware systems. He originated the UL 4600 standard for autonomous system safety issued in 2020. He is a faculty member of the Carnegie Mellon University ECE department where he teaches software skills for mission-critical systems. In 2018 he was awarded the highly selective IEEE-SSIT Carl Barus Award for outstanding service in the public interest for his work in promoting automotive computer-based system safety. In 2022 he was named to the National Safety Council’s Mobility Safety Advisory Group. In 2023 he was named the International System Safety Society’s Educator of the Year. He is the author of the books Better Embedded System Software (2010), How Safe Is Safe Enough? Measuring and Predicting Autonomous Vehicle Safety (2022), and The UL 4600 Guidebook (2022).
Professor of Artificial Intelligence, University of Queensland
Tim Miller is a Professor of Artificial Intelligence in the School of Electrical Engineering and Computer Science at The University of Queensland, Meaanjin/Brisbane, Australia. His mission is to augment and amplify the capabilities of people and organisations using artificial intelligence. His research draws on machine learning, reinforcement learning, AI planning, interaction design, and cognitive science to help people make better decisions. He has worked on areas including explainable AI, human-AI mixed-initiative planning, and human-centred decision support. Prior to his appointment at The University of Queensland, he was a Professor of Computer Science in the School of Computing and Information Systems at The University of Melbourne, where he was founding co-director of The Centre for AI and Digital Ethics.
CEO, KJR & CTO, Datarwe
Dr Kelvin Ross has over 30 years of experience in software engineering and enterprise IT applications. Kelvin started his IT career in safety-critical software engineering in defence, working on F/A-18 airborne radar systems. After completing his PhD in safety-critical systems engineering and spending several years consulting in defence and transportation systems, he moved to the commercial sector and founded KJR, specialising in software testing and assurance, which now has over 80 consultants in Sydney, Canberra, Melbourne and Brisbane.
Kelvin is recognised as an expert in testing and assurance of software applications across a broad range of industry domains, including e-health, public administration, finance, insurance, retail and telecommunications. In addition to his role as Chairman of KJR, Kelvin is a Director of the non-profit Healthcare AI Innovation Hub, IntelliHQ, and engages broadly in technology advisory roles, including director roles in innovative AI startups. He has broad interests in Machine Learning, which he sees as the dominant technology driver for the next several decades, particularly within the healthcare sector.
Kelvin is an Associate Adjunct Professor at the Institute for Intelligent and Integrated Systems (IIIS), Griffith University, an organiser of Young Women Leaders in AI and the Gold Coast AI and Machine Learning meetup group, and a member of the ACS AI Ethics national committee. He has held several roles in national technical working groups (NATA and ACS) and several board positions.
This year's conference will be held at Customs House, 399 Queen St, Brisbane City QLD.