Abstract: The major powers focus on science and technology development in order to build military power with strategic impact. High-technology weapons, also available to non-state actors, are assumed to shape the nature of warfare in the twenty-first century. Semiconductors, cloud computing, robotics, and big data are among the components needed to develop the AI that will model and define the future battlespace. Artificial intelligence will be applied to nuclear, aerospace, aviation, and shipbuilding technologies to provide future combat capabilities. The incorporation of AI into military systems and doctrines will shape the nature of future warfare and, implicitly, will decide the outcome of future conflicts. Before fielding a weapon system, military and political leaders should consider how it can be used and whether it should be used in a certain manner. A strong and clear regulatory framework is needed. The automatic processing of plans and orders (automatic control) requires policy control, and autonomous machines need some level of human control and accountability. Imagine what could happen if a system like HAL 9000 or the WarGames supercomputer could make an autonomous decision. Some fictional stories have imagined a dystopian future in which machine intelligence grows until it surpasses human intelligence and machines exert control over humans. As Freedman concludes in The Future of War, most claims made by military futurists are wrong, yet they remain influential nonetheless. Humans tend to delegate ever more responsibility to machines in collaborative systems, and in the future the automatic design and configuration of military operations will be entrusted more and more to machines. Given human nature, if we grant machines autonomy, we cannot expect anything better from them than the behavior of their creators. So why should we expect a machine to 'do the right thing'?
In light of what has been discussed here, it could be argued that some military applications of EDTs may jeopardize ...