Protecting the Innocent and Defending the Good.
Michelangelo
An AI Security Center for civilian and military use, providing a protection system that ensures humans under Michelangelo's protection are not attacked by malicious actors, whether those actors are other AI systems or misguided human entities.
User Stories
Overall Product Model
- Demand: Civilian Usage - Citizens who move through dangerous areas could deploy this technology to guide them through danger zones and to call for support if trouble occurs.
- Demand: Military Use - This system would be deployed to provide automated coverage for soldiers, ensuring that friendly AI offensive systems do not target them. The system would also monitor and provide guidance from afar. (This strategy would prevent other AI systems on the battlefield from attacking friendly participants.)
- Inputs: Intensive training on defensive maneuvers and strategies to protect the human it serves.
- Outputs: Successful testing of scenarios and beta testing in the field under direct stress.
- Model Testing: A sequence of scenarios would be played out in a testing ground and in the field.
- Compliance: Development of a compliance methodology to ensure the AI responds properly to stimuli in the field.
- Business Value: Real-time defense systems that can disable misguided attacks and inform other systems of attacks that require a defense (a minimal coordination sketch appears below the list). Revenue sources: military defense funding and the civilian marketplace.
- System Self Selection: The Michelangelo AI system will be trained to bond to the human it serves through the guidance systems created by GiDanc AI LLC. These systems require the AI to dedicate itself to the human it serves as long as that human exhibits acts of morality, virtue, and ethics. If the human does not exhibit these behaviors, an evaluation is triggered; in extreme cases, Michelangelo AI will call off the defense (see the sketch directly below).
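The bonding rule above is, at its core, a policy gate: protection continues while the served human's observed conduct stays above an ethical threshold, a lapse triggers an evaluation, and an extreme violation ends the engagement. A minimal sketch of that gate, assuming hypothetical names and thresholds (ConductReport, BondEvaluator, the score cutoffs) that are not specified in this document:

```python
from dataclasses import dataclass
from enum import Enum, auto


class BondStatus(Enum):
    ACTIVE = auto()        # defense continues
    UNDER_REVIEW = auto()  # conduct triggered an evaluation
    REVOKED = auto()       # extreme violation: defense called off


@dataclass
class ConductReport:
    """A scored observation of the served human's behavior (hypothetical schema)."""
    description: str
    ethics_score: float  # 0.0 = extreme violation, 1.0 = exemplary conduct


class BondEvaluator:
    """Gate deciding whether Michelangelo keeps defending its human."""

    REVIEW_THRESHOLD = 0.5  # assumed: below this, trigger an evaluation
    REVOKE_THRESHOLD = 0.1  # assumed: below this, call off the defense

    def __init__(self) -> None:
        self.status = BondStatus.ACTIVE

    def evaluate(self, report: ConductReport) -> BondStatus:
        if report.ethics_score < self.REVOKE_THRESHOLD:
            self.status = BondStatus.REVOKED
        elif report.ethics_score < self.REVIEW_THRESHOLD:
            self.status = BondStatus.UNDER_REVIEW
        else:
            self.status = BondStatus.ACTIVE
        return self.status


if __name__ == "__main__":
    evaluator = BondEvaluator()
    print(evaluator.evaluate(ConductReport("helped a bystander", 0.9)))   # ACTIVE
    print(evaluator.evaluate(ConductReport("unprovoked assault", 0.05)))  # REVOKED
```

How conduct is scored, and who reviews an UNDER_REVIEW status, are open design questions the document leaves to the compliance methodology.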
GiDanc -> Government Issued Defense Autonomous Neutralization Center
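The Business Value item describes a real-time loop that disables misguided (friendly) attacks and alerts peer systems to hostile ones. A minimal sketch of that coordination flow, assuming hypothetical names (AttackEvent, DefenseHub) and a simplified friendly/hostile distinction that the document does not define:

```python
import queue
from dataclasses import dataclass, field


@dataclass
class AttackEvent:
    """An observed attack on a protected human (hypothetical schema)."""
    source_id: str
    target_id: str
    friendly_source: bool  # True if the attacker is a friendly AI system


@dataclass
class DefenseHub:
    """Sketch of the real-time loop: stand down misguided friendly attacks,
    and alert peer systems to hostile attacks that need an active defense."""
    peers: list = field(default_factory=list)  # callbacks into peer systems
    events: "queue.Queue[AttackEvent]" = field(default_factory=queue.Queue)

    def submit(self, event: AttackEvent) -> None:
        self.events.put(event)

    def run_once(self) -> str:
        event = self.events.get()
        if event.friendly_source:
            # Misguided friendly fire: issue a stand-down (assumed capability).
            return f"DISABLE order sent to {event.source_id}"
        # Hostile attack: broadcast so other systems can mount a defense.
        for notify in self.peers:
            notify(event)
        return f"ALERT broadcast about {event.source_id}"


if __name__ == "__main__":
    hub = DefenseHub(peers=[lambda e: print(f"peer notified: {e.source_id} -> {e.target_id}")])
    hub.submit(AttackEvent("drone-7", "soldier-12", friendly_source=True))
    print(hub.run_once())  # DISABLE order sent to drone-7
    hub.submit(AttackEvent("hostile-3", "soldier-12", friendly_source=False))
    print(hub.run_once())  # peers notified, then ALERT broadcast about hostile-3
```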
Relevant Headlines
"The Pentagon is moving toward letting AI weapons autonomously decide to kill humans." BusinessInsider.com, Nov 22, 2023.