
Artificial intelligence in military operations is accelerating the development of autonomous weapons

As artificial intelligence makes its way into industries such as finance and healthcare, governments worldwide are increasingly investing in one of its most contentious applications: autonomous weapons systems. Many are already developing programs and technologies that they hope will give them an edge over their adversaries, generating mounting pressure for others to follow suit.

 

These investments appear to indicate the early phases of an AI arms race. Much like the nuclear arms race of the 20th century, this type of military escalation poses a threat to all humanity and is ultimately unwinnable. It incentivizes speed over safety and integrity in the development of new technologies, and as such technologies proliferate, it offers no long-term advantage to any one player.


THE UNITED STATES

UN Position

In April 2018, the US underlined the necessity to create “a shared understanding of the risks and benefits of this technology before deciding on a specific policy response. We remain convinced that it is premature to embark on negotiating any specific legal or political instrument in 2019.”

AI in the Army

 

  • In 2014, the Department of Defense announced its ‘Third Offset Strategy,’ the goal of which, as explained in 2016 by the then-Deputy Secretary of Defense, is “to exploit all advances in artificial intelligence and autonomy and insert them into DoD’s battle networks (…).”
  • The 2016 report ‘Preparing for the Future of Artificial Intelligence’ also addresses the weaponization of AI, noting: “Given improvements in military technology and AI more widely, scientists, strategists, and military specialists all agree that the future of LAWS is hard to forecast and the rate of change is rapid.”
  • The Advanced Targeting and Lethality Automated System (ATLAS) program, a US Army initiative, “will use artificial intelligence and machine learning to give ground-combat vehicles autonomous targeting capabilities.”

Cooperation with the Private Sector

  • Establishing collaboration with private companies can be difficult, as the widely publicized case of Google and Project Maven revealed: after protests from Google workers, Google announced it would not renew its contract. Nevertheless, other tech companies such as Clarifai, Amazon, and Microsoft still collaborate with the Pentagon on this project.
  • The Project Maven controversy deepened the gap between the AI community and the Pentagon. The government has developed two new initiatives to help bridge this gap.
  • DARPA’s OFFSET program, which envisions “swarms comprising upwards of 250 unmanned aircraft systems (UASs) and unmanned ground systems (UGSs) to accomplish diverse missions in complex urban environments,” has been developed in collaboration with several universities and start-ups.
  • DARPA’s Squad X Experimentation Program, which aims to give human fighters “a greater sense of confidence in their autonomous partners” and “a better understanding of how the autonomous systems would likely act on the battlefield,” has been developed in collaboration with Lockheed Martin.

 

INDIA

UN Position

India has been a member of the UN Security Council for eight terms (a total of 16 years), the most recent being the 2021–22 term. India is a member of the G4, a group of nations that back one another in seeking permanent seats on the Security Council and advocate for the reform of the UNSC.

AI in the Army

The Indian Army, following one of its finest traditions of continually evolving through the acquisition of modern intelligent platforms, has pushed extensively towards automation, albeit with limited success. In this regard, the creation of the Army Design Bureau is a notable achievement, providing the force with an organized framework for discussions with major industrial and academic players on the design and development of smart platforms. The scope of the design, development, and deployment of artificial-intelligence-based systems for the Indian Army can be simplified into the following four major areas:

  • Wartime operations: Operations of infantry, armored, and mechanized forces fall in this domain. Applications include detection of shooters in built-up areas and real-time recommendations for mission planning and execution in counterterrorism/counterinsurgency environments (infantry); real-time battlefield-management systems for AI-based battlefield transparency (armored forces); and reduced sensor-to-shooter delays through computer-vision-based target acquisition, along with predictive maintenance of fleets (mechanized forces).

  • Pseudo-wartime operations: Most services relevant in both wartime and peacetime fall in this domain, such as intelligence, surveillance, and reconnaissance operations. These include imagery intelligence applications for target identification, target classification, intent recognition, and decision support; signals intelligence applications that use named-entity recognition and intent recognition on call detail records, instant messages, or social media posts of extremist outfits to anticipate their next actions; open-source intelligence applications that benefit from computer-vision-driven identification of enemy targets (both living and non-living) through image/video analytics of social media footage; and human intelligence applications for the efficient optimization of human intelligence assets, mission support, and so on.

  • Peacetime operations: Operations carried out when no actual war is underway fall in this domain. Examples include repair and maintenance applications, such as predictive maintenance that forecasts the next servicing, remaining product life, and potential failures from past years’ data; supply chain and logistics applications for optimizing routing and forecasting demand and supplies for efficient mission planning; force protection applications that use computer vision and speech recognition to improve base access and perimeter security; and force-structure maintenance applications that give units and commanders early warning about at-risk personnel (individuals at risk of suicide or of violence toward colleagues, identified through analytics of their behavioral, social, and clinical data).

  • Aid-to-civil-authorities operations: These operations can be simplified into two major areas: humanitarian assistance and disaster relief, and crowd and riot control. Both areas have significant potential to be revolutionized by smart platforms, with applications such as automatic identification of at-risk individuals during disasters by analyzing social media chatter, pinpoint-precision support mechanisms, and monitoring of heavily crowded areas using computer-vision methods to enable a targeted rather than a mass response for effective crowd management and riot prevention.
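To make the predictive-maintenance idea in the list above concrete, here is a minimal, purely illustrative sketch. The vehicle names, telemetry fields, and thresholds are all hypothetical; a real fleet system would learn from rich sensor histories rather than apply a simple statistical rule like this one.

```python
from statistics import mean, stdev

# Hypothetical fleet telemetry: hours run since last service and an
# engine-vibration reading for each vehicle (all values synthetic).
fleet = {
    "veh-01": {"hours": 120, "vibration": 2.1},
    "veh-02": {"hours": 480, "vibration": 6.8},
    "veh-03": {"hours": 300, "vibration": 2.4},
    "veh-04": {"hours": 510, "vibration": 7.2},
}

def maintenance_flags(fleet, z_cut=1.0, hour_limit=450):
    """Flag vehicles whose vibration reading is anomalously high relative
    to the rest of the fleet, or that have exceeded a service-hour limit."""
    readings = [v["vibration"] for v in fleet.values()]
    mu, sigma = mean(readings), stdev(readings)
    flagged = []
    for name, v in fleet.items():
        z = (v["vibration"] - mu) / sigma  # how unusual this reading is
        if z > z_cut or v["hours"] > hour_limit:
            flagged.append(name)
    return flagged

print(maintenance_flags(fleet))
```

The same flag-the-outlier pattern generalizes to the other applications listed (demand forecasting, at-risk-personnel alerts): score each item against the population, then route the flagged ones to a human for review.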

Cooperation with the Private Sector

The Indian government has cleared several applications that had been pending for over four years. This is a strategic move given the private sector’s ability to undertake defense projects which, until now, had been run by foreign vendors and state-run entities.

This is the first step towards enabling firms like Pipavav, Tata, and Mahindra to set up production units for major defense equipment. Pipavav Defence and Offshore Engineering Company (PDOC), which is being acquired by Reliance, has been given four permits to manufacture defense items such as medium tanks, howitzers, missiles, sensors, and torpedoes. Tata Group will upgrade main battle tanks such as the Indian Army’s T-90 and T-72 units, while Mahindra has been permitted to manufacture naval systems such as torpedoes, boats, and sea mines.

 

Companies like Tech Mahindra and Mahindra Telephonics Integrated Systems have also been given clearances. Some of India’s smaller companies like MKU – a bulletproof equipment manufacturer – have been permitted to manufacture night vision devices.


CHINA

UN Position

China has expressed its “desire to negotiate and conclude” a new protocol to ban the use of fully autonomous lethal weapons. However, China does not want to prohibit the development of these weapons, which has raised questions regarding its precise position.

AI in the Army

There have been calls from within the Chinese government to avoid an AI arms race. This opinion is echoed in private industry: the chairman of Alibaba has warned that new technologies such as machine learning and artificial intelligence could result in a Third World War.

Despite all these concerns, China’s leadership is continuing to pursue the use of AI for military purposes.

 

Cooperation with the Private Sector

To advance military innovation, President Xi Jinping has called for China to follow “the road of military-civil fusion-style innovation,” such that military innovation is integrated into China’s national innovation system. This fusion has been elevated to the level of a national strategy.

The People’s Liberation Army (PLA) relies heavily on technology firms and advanced start-ups. Many of the largest AI research organizations in China are located within the private sector.

 

There is an increasing number of collaborations involving academic and defense institutions in China. For instance, Tsinghua University established the Military-Civil Fusion National Defense Peak Technologies Laboratory to create “a platform for the pursuit of dual-use applications of emerging technologies, especially artificial intelligence.”

 

Regarding the application of artificial intelligence to weapons, China is presently developing “next-generation stealth drones,” such as Ziyan’s Blowfish A2. According to the firm, this model “autonomously performs more complex combat assignments, such as fixed-point timing detection, fixed-range reconnaissance, and concentrated precision strikes.”

 

RUSSIA

UN Position

Russia has said that the debate around lethal autonomous weapons should not ignore their potential benefits, arguing that “the concerns regarding LAWS can be addressed through faithful implementation of the existing international legal norms.” Russia has actively sought to limit the number of days allotted for such talks at the UN.

AI in the Army

While Russia does not have a military-only AI strategy, it is nonetheless working to incorporate AI more widely into its armed forces.

The Foundation for Advanced Research Projects (the Foundation), which may be seen as the Russian equivalent of DARPA, opened the National Center for the Development of Technology and Basic Elements of Robotics in 2015.

 

At a conference on AI in March 2018, Defense Minister Shoigu pushed for increased collaboration between military and civilian scientists in developing AI technology, which he said was critical for countering “potential threats to Russia’s technological and economic security.”

 

In January 2019, reports emerged that Russia was developing an autonomous drone that “will probably be able to take off, accomplish its mission, and land without human interference,” although “weapons use will require human approval.”

 

Cooperation with the Private Sector

A new city called Era, devoted entirely to military innovation, is currently under construction. According to the Kremlin, the “primary aim of the research and development carried out at the technopolis is the creation of military artificial intelligence systems and supporting technologies.”

 

 

In 2017, Kalashnikov, Russia’s largest gun maker, announced that it had developed a fully automated combat module based on neural-network technology, allowing it to identify targets and make decisions.

 

UNITED KINGDOM

UN Position

The UK considers that an “autonomous system is capable of understanding higher-level intent and direction.” It suggested that autonomy “confers significant benefits and has existed in weapons systems for decades” and that “evolving human/machine interfaces will permit us to execute military functions with greater precision and efficiency.” However, it added that “the application of lethal force has to be directed by a human, and that a human will remain accountable for the decision.” The UK also said that “the current lack of consensus on key themes counts against any legal prohibition,” which “would not have any functional effect.”

AI in the Army

A 2018 Ministry of Defense report underlines that the MoD is pursuing modernization “in areas like artificial intelligence, machine learning, man-machine teaming, and automation to deliver the disruptive effects we want in this respect.”

 

The MoD has various programs related to AI and autonomy, such as the Autonomy program.

 

In terms of weaponry, the best-known example of autonomous technology currently under development is the top-secret Taranis armed drone, the “most technically advanced demonstration aircraft ever built in the UK,” according to the MoD.

 

Cooperation with the Private Sector

The MoD has a cross-government organization called the Defense and Security Accelerator (DASA), established in December 2016. DASA “finds and funds exploitable innovation to support UK defense and security quickly and effectively, and support UK prosperity.” On this theme, the director of Blue Bear Systems stated, “The ability to deploy a swarm of low-cost autonomous systems provides a new paradigm for battlefield operations.”

 

FRANCE

UN Position

France understands LAWS’ autonomy as total: no form of human supervision from the moment of activation, and no subordination to a chain of command. France stated that a legally binding instrument on the issue would not be appropriate, describing such a ban as neither realistic nor desirable. France did propose a political declaration that would reaffirm fundamental principles and “would underline the need to maintain human control over the ultimate decision to use lethal force.”

AI in the Army

France’s national AI strategy is detailed in the 2018 Villani Report, which states that “the rising use of AI in certain sensitive areas such as […] in defense (with the question of autonomous weapons) raises a real society-wide debate and implies an analysis of the issue of human responsibility.”

This was echoed by the French Minister for the Armed Forces, Florence Parly, who stated that “giving a machine the choice to fire or the decision over life and death is out of the question.”

 

On defense and security, the Villani Report argues that using AI will be a necessity in the future to carry out security missions, maintain superiority over potential adversaries, and preserve France’s standing relative to its allies.

 

The Villani Report describes DARPA as a model, though not one to copy outright. Some of DARPA’s methods “should inspire us nonetheless,” the report says, “in particular as regards the President’s wish to set up a European Agency for Disruptive Innovation, enabling the funding of emerging technologies and sciences, including AI.”

 

 

The Villani report highlights the introduction of a “civil-military complex of technological innovation, focused on digital technology and more specifically on artificial intelligence.”

 

Cooperation with the Private Sector

In September 2018, the Defense Innovation Agency (DIA) was created within the Direction Générale de l’Armement (DGA), France’s arms technology and procurement agency. According to Parly, the new agency “will bring together all the actors of the ministry and all of the programs that contribute to defense innovation.”

 

 

One of the most advanced projects now underway is the nEUROn unmanned combat air vehicle, developed by French arms producer Dassault on behalf of the DGA, which can fly for more than three hours.

 

ISRAEL

UN Position

In 2018, Israel stated that the “development of rigid standards or imposition of prohibitions on something so speculative at this early stage could be imprudent and might yield an uninformed, misguided result.” Israel underlined that “[w]e should also be conscious of the military and humanitarian advantages.”

AI in the Army

Israeli use of AI tools in the military is expected to increase rapidly in the near future.

The central technological unit of the Israel Defense Forces (IDF), and the engine behind most of its AI developments, is C4i. Within C4i sits the Sigma branch, whose “purpose is to develop, research, and implement the latest in artificial intelligence and advanced software research to keep the IDF up to date.”

 

The Israeli military deploys weapons with a considerable degree of autonomy. The most prominent example is the Harpy loitering munition, also referred to as a kamikaze drone: an unmanned aerial vehicle that can loiter for a substantial length of time before engaging ground targets with an explosive warhead.

 

 

Israel was among the earliest countries to “show that it has deployed fully automated robots: self-driving military vehicles to patrol the border with the Palestinian-governed Gaza Strip.”

 

Cooperation with the Private Sector

Public-private ventures are common in the development of Israel’s military technology. There is a “close relationship between the Israeli military and the digital sector,” which is reported to be one of the reasons for the country’s AI leadership.

 

 

Israel Aerospace Industries, one of Israel’s largest arms companies, has been developing increasingly autonomous weapons, including the above-mentioned Harpy.


SOUTH KOREA

UN Position

In 2015, South Korea stated that “the discussions on LAWS should not be carried out in a way that can hamper research and development of autonomous technologies for civilian use.” Still, it is “cautious of autonomous weapons systems that remove meaningful human control from the operation loop, due to the risk of malfunctioning, potential accountability gaps, and ethical concerns.” In 2018, it again raised concerns about limiting civilian applications and pointed to the positive defense applications of autonomous weapons.

AI in the Army

 

In December 2018, the South Korean Army announced the launch of a research institute focused on artificial intelligence, called the AI Research and Development Center. The aim is to capitalize on cutting-edge technologies for future combat operations and “turn the center into the military’s next-generation combat control tower.”

South Korea is developing new military units, including the Dronebot Jeontudan (“Warrior”) unit, with the aim of developing and deploying unmanned platforms that incorporate advanced autonomy and other cutting-edge capabilities.

South Korea is known to have used the armed SGR-A1 sentry robot, which has operated in the demilitarized zone separating North and South Korea. The robot has both a supervised mode and an unsupervised mode. In the unsupervised mode, “the SGR-A1 identifies and tracks intruders, eventually firing at them without any further intervention by human operators.”

 

Cooperation with the Private Sector

Public-private cooperation is an integral part of the military strategy: the plan for the AI Research and Development Center is “to build a network of collaboration with local universities and research entities such as the KAIST [Korea Advanced Institute for Science and Technology] and the Agency for Defense Development.”

 

In September 2018, South Korea’s Defense Acquisition Program Administration (DAPA) launched a new strategy to develop its national military-industrial base, with an emphasis on boosting ‘Industry 4.0 technologies’, such as artificial intelligence, big data analytics and robotics.

