Friday, 30th June 2023

Dr Haitham S. Cruickshank – University of Surrey – More details coming soon!

Details of our previous TAS-S seminars are given below. Several of the talks were recorded and are available via this OneDrive link. Please note that the talk by Dr. Phil Koopman can be accessed via YouTube or this external link.

Friday, 9th December, 2pm-3pm UK time
‘Discovering Unknowns on Visual Data’ with Yang Zhou (Loughborough University).

Unknown discovery refers to novel information-seeking, which plays a key role in autonomous robotic exploration in extreme environments, especially underwater and in space. Novel features can attract the interest and attention of an agent, producing human-like curiosity behaviour that benefits many applications such as unusual object discovery, landmarking and rare data collection. Previous methods, like saliency detection, require human labels and complicated procedures. With the development of deep learning for out-of-distribution feature detection, end-to-end unsupervised methods are potential solutions for detecting unknown features and providing real-time navigation guidance to robots as they interact with the environment.

Yang received his MSc from the Department of Computer Science at Loughborough University in 2018 and is about to receive his PhD with the EPSRC CDT-EI at Loughborough University. He is now a KTP Research Associate with Loughborough University and MOA Technology Ltd, Oxford. His research interests include computer vision, pattern recognition and image processing. He has published papers at conferences such as ACCV and IEEE SMC, and in journals such as Pattern Recognition.

‘An ML Boosted Software Engine for Next Generation Drones’ with Dr. Carl Sequeira (Flarebright).
Friday, 25th November 2022, 2pm-3pm UK time. #TASSTalk

Market studies on the commercial use of drones repeatedly and consistently indicate values in the order of billions of dollars by 2030. Yet drones are still too unreliable to fly ‘beyond visual line of sight’ (BVLOS), and consequently regulations to date continue to hold this growth back. This perception of unreliability has been reinforced by a recent UK Air Accident Investigation Branch (AAIB) report, which indicated that unmanned aircraft systems (UAS) accounted for a quarter of all occurrences received by the AAIB in 2021. Furthermore, this report highlighted that the predominant cause of UAS accidents was loss of control in flight.

In this presentation, the capabilities offered by Flare Bright’s advanced Machine Learning (ML) driven software engine will be introduced. Using flight-verified use cases, we will show how Flare Bright’s software, acting as the ‘brain’ of the drone, enables drones to fly by themselves when communications and satellite positioning signals are lost, and improves their ability to automatically handle adverse weather conditions and land safely. In this way, Flare Bright’s software engine is a route towards realising truly reliable BVLOS flights, thereby unleashing the potential of the unmanned aerial system market. The potential for collaboration with academia en route to Flare Bright realising its ambitions will also be discussed.

Carl Sequeira is Flare Bright’s Engineering Manager and Aerodynamics Specialist, where he leads an engineering team pushing the boundaries of embedded AI and miniaturisation in the unmanned aircraft sector. A Chartered Engineer with the IMechE since 2015, Carl is a multi-disciplinary aeronautical engineer with a PhD in unsteady fluid mechanics from the University of Cambridge and commercial and academic research expertise spanning aerodynamics, modelling and simulation, design and test, and device performance. This expertise has been reflected in awards throughout his career, including Cambridge’s Morien Morgan prize for the ‘greatest distinction in Aeronautical Engineering’ as an undergraduate and more recently the 2016 ASME Turbo Expo Best Paper Award (Education category).

Previously, as the Senior Flight Sciences Engineer, Carl led the flight simulation team at Hybrid Air Vehicles (HAV), developed the in-house aerodynamic design capability and was a key member of the flight test team that took the innovative Airlander to the skies above Bedford, UK. He has a firm understanding of aircraft regulatory processes, having served as an EASA-approved Compliance Verification Engineer for aerodynamics, performance and loads while at HAV. He was a Research Associate at Cambridge’s Whittle Lab, where he was the founding member of the Cambridge Tidal Group, and has worked on a range of aerospace-related subjects in the past with Rolls-Royce in Derby, UK, ONERA in Meudon, France, and the VKI in Rhode-St-Genèse, Belgium.

Friday, 11th November, 2pm-3pm UK time. ‘Hierarchical Potential-based Reward Shaping from Specifications’ with Dr. Dejan Ničković (Austrian Institute of Technology). #TASSTalk

The automatic synthesis of policies for robotic control, including autonomous driving tasks, through reinforcement learning relies on a reward signal that simultaneously captures many possibly conflicting requirements. We introduce a novel, hierarchical, potential-based reward-shaping approach (HPRS) for defining effective, multivariate rewards for a large family of such control tasks. We formalize a task as a partially ordered set of safety, target, and comfort requirements, and define an automated methodology to enforce a natural order among requirements and shape the associated reward. Building upon potential-based reward shaping, we show that HPRS preserves policy optimality. Our experimental evaluation demonstrates HPRS’s superior ability in capturing the intended behavior, resulting in task-satisfying policies with improved comfort, and converging to optimal behavior faster than other state-of-the-art approaches. We demonstrate the practical usability of HPRS on several robotics applications and the smooth sim2real transition on two autonomous-driving scenarios for F1TENTH race cars.
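For readers unfamiliar with the building block HPRS extends, classical potential-based reward shaping can be sketched as follows. This is a minimal illustration only, not the speaker's implementation; the one-dimensional state, goal position, and potential function below are invented for the example:

```python
# Potential-based reward shaping: add F(s, s') = gamma * phi(s') - phi(s)
# to the base reward. This provably leaves the set of optimal policies
# unchanged, which is the property HPRS preserves.
GAMMA = 0.99

def potential(state: float) -> float:
    # Hypothetical potential: negative distance to a goal at x = 10.
    return -abs(10.0 - state)

def shaped_reward(base_reward: float, state: float, next_state: float) -> float:
    # Shaping term rewards progress towards higher-potential states.
    return base_reward + GAMMA * potential(next_state) - potential(state)

# A step towards the goal earns a positive shaping bonus...
print(shaped_reward(0.0, state=4.0, next_state=5.0))  # 0.99*(-5) - (-6) = 1.05
# ...while a step away from it is penalised.
print(shaped_reward(0.0, state=5.0, next_state=4.0))  # 0.99*(-6) - (-5) = -0.94
```

HPRS extends this idea by deriving the potential from a partially ordered set of safety, target, and comfort requirements rather than a single hand-crafted function.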

Priv. Doz. Dr. Dejan Nickovic is a Senior Scientist at AIT Austrian Institute of Technology GmbH, in the Center for Digital Safety and Security. He has been with AIT since December 2011 and focuses on research in verification and testing of complex cyber-physical systems, including autonomous systems. He is the coordinator of the national FFG project “Autonomous Driving Examiner” (ADEX) and serves as Technical Manager in the H2020 project “Foundations for Continuous Engineering of Trustworthy Autonomy” (FOCETA). Dejan Nickovic has published more than 70 scientific papers in peer-reviewed international journals and conference and workshop proceedings.

‘Trusted Data Sharing (TDS): sharing data based on trust in dynamic (eco)system life cycles’ with Dr. Arthur van der Wees, Arthur’s Legal.

TAS-S Seminar.  Friday, 18th March 2022

Arthur van der Wees is managing director of Arthur’s Legal, Strategies & Systems, an international research-based strategic and legal organization covering the unique combination of technology, strategy, impact, ethics and law, with a focus on (inter)national, regional and global strategy and policy in this Digital Age. It has a global practice with multiple relevant programmes and projects in the UK, EU and US, across the public, private and public-private sectors, on security, sovereignty, safety, the internet and identity of things, data sharing, human-centricity, nuances of trust, trustworthy cyber-physical ecosystems of ecosystems, dynamic assurance and accountability.

He is (co-)author of various publications about innovation, digital transformation, data, computing, IoT, robotics, AI, manufacturing, autonomous systems, security, safety, privacy and trust. He has contributed to several regulations, standards and policy instruments for the Digital Age. Furthermore, he is an advisory board member or partner in more than 15 European projects.

‘Trust and Governance for Autonomous Vehicle Deployment’, Dr. Phil Koopman, Carnegie Mellon University. Tuesday, 1st March 2022.

In this talk, Dr. Koopman explored the complications of AV safety, industry myths, deployment governance, and regulation and trust. This talk can be accessed via YouTube or this link.

‘Industrial Perspectives of Artificial Intelligence’ with Prof. Nick Colosimo, BAE Systems.

This talk addressed the following:

  • A discussion of key trends in AI and autonomous systems from a defence industrial perspective.
  • Application areas and solutions in terms of products, services, and process improvement.
  • Outstanding challenges from a defence industrial perspective, relevant to safety and security.
  • Views on how those challenges could be addressed.
  • Future catalysts and “game changers”.


Prof. Nick Colosimo started with BAE Systems (then British Aerospace) in 1990 as a technical apprentice.  In his current role he defines technology strategy and planning, and provides innovative solutions to hard technical problems in the context of the future combat air system (FCAS) project.  He is also the Principal Technologist for Disruptive Technologies and a visiting Professor at Cranfield University.

If you would like to find out more about this talk or request a copy of the slides, please email Prof. Colosimo directly. 

Invited Industrial Talk from Spirent on Trustworthy Autonomy.

Spirent is a British FTSE 250 company in telecommunications, navigation, and autonomy. They are industrial sponsors of many MSc and research projects at Cranfield University.

‘Security challenges for collaborative autonomous aircraft systems’ with Dr. Cora Perner, Airbus.

Increasing autonomy is an emerging topic for both civil and military aircraft systems. However, the increased connectivity of previously isolated services, in combination with legacy systems, leaves such systems vulnerable to cyber attacks. This talk covered challenges related to securing autonomous aircraft systems operating in the same airspace as crewed aircraft. The focus was on establishing trust with potential collaborators as well as on investigating the impact of a propagating attack on the success of a collaborative mission.

Dr. Perner is a Cybersecurity Aeronautics Architect with Airbus Cybersecurity and leads several research projects. She completed her PhD in Computer Science from the Technical University of Munich and a degree in Aerospace Vehicle Design from Cranfield University.

‘Safe Autonomous Systems: Challenges and Potential Solutions’ with Wilfried Steiner, TTTech Labs.

Over the last decades we have managed to build quite sophisticated dependable systems, like airplanes or power plants. However, the complexity of autonomous systems like self-driving cars is unprecedented, and so is their safety assurance. This talk discussed a conceptual architecture as the foundation for safe autonomous systems, followed by practical design considerations and challenges. Formal verification studies and possible strategies for achieving dependability in systems that incorporate ML components were also presented.

Wilfried Steiner is the Director of TTTech Labs, which acts as the centre for strategic research as well as the centre for IPR management within the TTTech Group. He holds the degree of Doctor of Technical Sciences and the Venia Docendi in Computer Science, both from the Vienna University of Technology, Austria.

His research is focused on dependable cyber-physical systems for which he designs algorithms and network protocols with real-time, dependability, and security requirements.    

‘Secure-by-Design – the challenge of moving beyond Cyber Risk Management to Cyber Resiliency’ with Dr. Alex Tarter, Thales.

One of the challenges for companies building modern critical infrastructure is that you need to design highly complex interconnected systems that can withstand a changing cyber attack landscape over the long-term. Our traditional risk management approaches don’t let us take into consideration emergent properties, critical interdependencies or the ability to continuously change.

This talk described some of the approaches and ideas Thales is employing to try and integrate continuous assurance and resilience into our critical systems – so that we can trust that they will work as expected.  

Dr. Tarter has been working in the fields of Defence and Critical National Infrastructure cyber security for over 15 years. As the CTO-Cyber, he is responsible for shaping the technical strategy and cyber capabilities of Thales UK. This includes leading the Thales UK Cyber Competence Centre, cyber-related R&D, and global product line management for our cyber security consulting offers.

‘Security by Design for IoT and Automotive using THREATGET’ with Dr. Willibald Krenn, Austrian Institute of Technology (AIT).

AIT is developing a security-by-design tool called THREATGET. In this talk, Dr. Krenn presented the current state of the tool and how it can be applied in the IoT and automotive domains using some examples. After discussing the basics of the tool and the threat modelling approach, Dr. Krenn discussed current challenges and research, like automated security attribute selection.

Dr. Krenn is the Thematic Coordinator of AIT’s Dependable Systems Engineering group and holds a PhD in computer science from Graz University of Technology. 

“Towards Safe, Trustworthy and Efficient Autonomous Vehicles” (19th August 2021)
Dr. Dezong Zhao, EPSRC Innovation Fellow, University of Glasgow.

The main challenge in autonomous driving is handling uncertainty, which imposes rigorous requirements on autonomous vehicles to guarantee safe and trustworthy decision making. To make this realistic, autonomous vehicles have to be interpretable, adaptable, verifiable and robust. These goals would be achieved by developing transparent and reliable tools in perception, planning, modelling and control. Moreover, current autonomous vehicles are power-hungry, so green driving solutions are expected. To this end, developing ecological driving strategies and event-camera-based perception falls within our research interests.

TAS-S/Security Lancaster seminar series
Previous seminars can be found on the Security Lancaster archive pages.

TAS Doctoral Training Network seminar series
Details of upcoming and previous seminars can be found on the TAS DTC webpages. 

TAS Node in Resilience seminar series
Details of upcoming and previous seminars can be found on the TAS-R Node’s events pages.   

TAS Node in Governance and Regulation
Details of upcoming and previous seminars can be found on the TAS-G Node’s events pages.

TAS Node in Verifiability
Details of upcoming and previous seminars can be found on the TAS-V Node’s events pages.