The IFAC TC9.2 addresses the impact of systems and control outcomes on socio-technical systems and organizations, on the human individual, and on society at the global scale. It is interested in planning system design so as to maximize the benefits of control algorithms while anticipating and avoiding their possible adverse effects, from a system-of-systems perspective, taking into account these three dimensions, reducing inequalities, and preserving the diversity of people.

While impact studies have always been of fundamental importance, they have become critical since the computer and internet revolution, which has dramatically widened the range of possible control solutions, with important impacts at all levels of society. In this sense, we are in the context of the digital revolution for sustainable development (Sachs et al., 2019), as detailed below.

Together with this technological revolution, the close interplay between humans and cyber-physical systems (CPS: embedded, networked, interconnected) has led to the emergence of a new research field named cyber-physical and human systems (CPHS). Advances in this field and its challenges require multi- and inter-disciplinary collaboration across the chain of modeling, simulating, analyzing, controlling, optimizing, and evaluating cyber-physical and human systems.

This joint effort to increase the impact of systems and control outcomes on socio-technical systems and organizations, on the individual human, and on society at the global scale must then consider the new challenges and opportunities of CPHS, and should involve specialists in cognition, ergonomics, psychology, neuro-ergonomics, neuroscience, law, philosophy, sociology, political science, and economics, together with computer and control sciences.

Indeed, designing these systems requires additional studies, prior and subsequent to the actual design, such as:

  • Identifying whether and when it is possible, relevant, and desirable to automate a system, or to keep the human in manual mode (e.g., bicycles or new mobility modes such as personal transporters); and how to share control resources between algorithms and humans (e.g., autonomous vehicles, multimodality).
  • Designing ethical control systems, driven by ideas from philosophy (e.g., the role of engineering development in avoiding moral dilemmas in automation).
  • Analyzing the transformation of society by recent technologies (AI/IoT/robotics), including through the methods of control theory (for example, the analysis and regulation of social-network influence on society). Such post-hoc analysis of the impact of computing and control on society could help in choosing new solutions in an optimized way.
  • Studying the anticipated impact of the desired system on the human individual or on the social system, prior to the design, which can then guide the design. This requires the development of appropriate methods and tools that consider automation, time, individuals, society, and the environment from a system-of-systems perspective.
  • Studying possible liability issues related to the targeted system, which could avoid massive investment in systems that are hardly deployable. Legal issues are indeed fundamental; they condition the social acceptance of a system and, consequently, its industrial success. This is also an economic question, and the appropriate type of regulation could be investigated through the economics of innovation as well as law and economics.
  • Using digital twins to demonstrate the benefits and impact of the control discipline on societal outcomes.
  • Working towards socially responsible automation as outlined in (Sampath & Khargonekar, 2018).
  • Considering new areas of study, such as the modelling and control of pandemics, which have proven to be crucial.
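As a minimal illustration of this last item, epidemic dynamics can be cast as a control problem: the classical SIR model with a simple threshold intervention policy (a hypothetical, illustrative sketch; parameter values are not from any real epidemic).

```python
# Minimal SIR epidemic model with an on/off intervention policy:
# when the infected fraction exceeds a threshold, the contact rate
# is reduced (e.g., distancing measures). Illustrative sketch only;
# all parameter values are assumptions, not fitted data.

def simulate_sir(beta=0.3, gamma=0.1, i0=0.01, days=200, dt=0.1,
                 threshold=0.05, reduction=0.5):
    """Forward-Euler integration of SIR with feedback intervention.

    Returns final (s, i, r) fractions and the peak infected fraction.
    """
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(int(days / dt)):
        # feedback "control": damp transmission while infections are high
        b = beta * reduction if i > threshold else beta
        new_inf = b * s * i * dt   # new infections this step
        new_rec = gamma * i * dt   # new recoveries this step
        s -= new_inf
        i += new_inf - new_rec
        r += new_rec
        peak = max(peak, i)
    return s, i, r, peak

s, i, r, peak = simulate_sir()
print(f"final susceptible={s:.3f}, peak infected={peak:.3f}")
```

Comparing runs with and without the intervention (reduction=1.0 disables it) shows how even a crude feedback law flattens the epidemic peak, which is the kind of control-theoretic question this research area addresses.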

With this increasing penetration of CPS into human lives, one must address the trade-off between complexity and reliability, which includes:

  • How to couple AI with control while ensuring transparency, accountability, explainability, and reliability? The answer to this question affects the performance (in its multiple dimensions: technological, economic, social), safety, and liability of the resulting system.
  • Modelling the human's cognitive representation of the automated system and of other manually driven systems, as machines in mixed automated and manual modes evolve together; this can help improve decision making and overall safety.
  • Control design must consider the time scales and capacities of the human in relation to the automated system, as well as different automation levels and different degrees of intrusion of the automated system with respect to the human.
  • Safety-critical and resilient CPHS must be studied; this has indeed become a major area of research.
  • Security, privacy, and ethics in cyber-physical and human systems must be addressed today.
  • Reversibility of systems: can we change the direction of an ongoing technological transformation?

In addition, closing the gap between theory and practice has huge potential to increase the impact of the control discipline on society, as does addressing public policies related to the deployment of new technologies.

Furthermore, as outlined recently, major societal drivers such as climate change mitigation and adaptation, healthcare and quality of life, smart infrastructure systems, the sharing economy, and the resilience of societal-scale systems call for new CPHS tools to tackle the associated challenges (Annaswamy et al., 2023).

Many other topics relate to societal impact: space technology, which has brought many new systems that positively impact our society; emotional robotics, whose impacts should be thoroughly studied; brain-computer interaction; and human-autonomy teaming, whose design challenges and ethical issues need to be outlined, to name a few.

TC9.2 takes part in addressing the design of control/AI systems by considering these dimensions, and many others, for their huge societal benefits.