[Fox News] AI-powered home security system strikes back with paintballs and tear gas

A Slovenian company called PaintCam is shaking things up in the home security world.

It has come up with this wild new gadget, the PaintCam Eve. 

It’s not just another security camera watching over your house. This thing packs a punch with paintball and tear gas projectiles to really give intruders a surprise they won’t soon forget.

CLICK TO GET KURT’S FREE CYBERGUY NEWSLETTER WITH SECURITY ALERTS, QUICK VIDEO TIPS, TECH REVIEWS AND EASY HOW-TO’S TO MAKE YOU SMARTER

The heart of Eve’s capability lies in its sophisticated computer vision technology. It can identify human faces and animals even in low-light conditions, distinguishing between friends and foes.

WHAT IS ARTIFICIAL INTELLIGENCE (AI)?

The system, which comes in three models — Eve, Eve+, and Eve Pro — allows homeowners to categorize visitors via an app interface, deciding who is welcome and who is not. But the most intriguing feature? When Eve detects an intruder, it issues a stern warning, and if the warning goes unheeded, it launches paintballs or tear gas.

MORE: CREEPY TOOL LETS CRIMINAL HACKERS ACCESS YOUR HOME VIDEO CAMERAS

PaintCam does offer users a significant degree of control. The system alerts the homeowner when an unknown person is detected in the company of someone known, asking whether to “take the shot” or not.

This feature places a heavy responsibility on the user, turning home security into a more interactive and potentially morally complex activity. How users will navigate these choices, especially in high-pressure situations, is yet to be seen.

MORE: 6 BEST OUTDOOR SECURITY CAMERAS 

While the prospect of a security camera that can “shoot” at intruders may sound appealing to some, it raises significant ethical and legal questions. The use of force, even non-lethal, by an autonomous system could lead to unintended consequences.

For instance, what happens if the system mistakenly identifies a neighbor or a child retrieving a lost toy as a threat? The legal ramifications of such scenarios remain unclear, making Eve a subject of debate among security experts and civil rights advocates alike.

MORE: SNEAKY LIGHTBULB SECURITY CAMERAS ARE THE NEXT BIG THING IN HOME SECURITY 

The global home security market is projected to reach an estimated $106.3 billion by 2030, indicating a vast potential customer base for innovative products like Eve. However, its market success will depend not only on consumer interest but also on navigating the legal landscape and the public perception challenges that such a confrontational device presents.

PaintCam launched Eve with a Kickstarter campaign on Tuesday. At the time of publishing, the exact cost of the security device has not yet been disclosed. You can sign up for notifications about the product on PaintCam’s official website, as well as on the Kickstarter product page. 

GET FOX BUSINESS ON THE GO BY CLICKING HERE

The company posted this mission statement on its site:

“We offer innovative solutions that seamlessly integrate with your environment, establishing both passive presence and active deterrence. Our unwavering commitment is to make the world a safer place, not by fortifying intimidation strongholds, but by delivering intelligent, adaptable, and elegant security options.”

This innovation invites us to reflect on the nature of home security. Are we moving towards a future where our homes are not just passively protected but actively defended by machines? And at what point does the integration of such technology in our daily lives challenge our notions of privacy and safety? Only time will tell whether systems like Eve will become the new norm or remain a curious footnote in the evolution of home security technologies.

Considering the potential for mistakes, do you feel comfortable with the idea of a security system like PaintCam Eve that can autonomously deploy paintballs or tear gas? Let us know by writing us at Cyberguy.com/Contact

For more of my tech tips & security alerts, subscribe to my free CyberGuy Report Newsletter by heading to Cyberguy.com/Newsletter

Ask Kurt a question or let us know what stories you’d like us to cover.

Copyright 2024 CyberGuy.com. All rights reserved.

[Fox News] The AI camera stripping away privacy in the blink of an eye

It’s natural to be leery regarding the ways in which people may use artificial intelligence to cause problems for society in the near future. On a personal level, you may be concerned about a future where artificial intelligence takes your job or creates a Terminator that comes back in time to try to eliminate a younger you. (We admittedly might be overthinking that one.)

One fear regarding AI on a personal level that you should know about because it’s very much in the present is the creation of deepfake photos, including those that strip you of the most basic of privacy rights: the right to protect images of your body.

Two German artists recently created a camera called NUCA that uses AI to create deepfake photos of subjects by stripping away their clothing. The automated removal of the photo subject’s clothing occurs in close to real-time, speeding up the creepy factor exponentially.

The two German artists, Mathias Vef and Benedikt Groß, decided to create the camera to show the implications of AI’s rapid advancements. The pair were trying to think of the worst possible uses of AI to affect someone’s privacy, and they realized that the technology needed to create a camera like NUCA was already possible.

The two artists hope that people will consider the dangers of continuing to develop AI technologies like this, which could eliminate the expectation of privacy. They hope it will spark debates about the direction of AI.

MORE: HOW SCAMMERS HAVE SUNK TO A NEW LOW WITH AN AI OBITUARY SCAM TARGETING THE GRIEVING

The artists used 3D design and printing software to create the camera's lenses and shell. A smartphone inside the shell handles image capture, and NUCA then sends the photo to the cloud, where AI removes the subject's clothing.

Of course, NUCA is not actually creating a photo of your naked body. Instead, it analyzes your gender, face, age and other aspects of your body shape to develop a replication of what AI believes your naked body would look like.

MORE: ARE AI DEEPFAKES THE END OF ACTING AS WE KNOW IT 

Deepfake nude photos, usually of celebrities, have been around for a long time on pornography websites, but the photos from NUCA require almost no technical know-how. 

Even more frightening, NUCA is able to perform the process within about 10 seconds. The immediacy of the creation of the deepfake nude photo is what sets NUCA apart from other fake nude photos that typically require quite a bit of editing skill and time. 

MORE: AI WORM EXPOSES SECURITY FLAWS IN AI TOOLS LIKE CHATGPT

Bottom line: Anyone could use the technology behind NUCA to create a deepfake nude photo of almost anyone else within seconds. NUCA doesn’t ask for permission to remove your clothing in the photo.

It’s worth emphasizing that the two artists have no plans to let others use NUCA for commercial gain. They will showcase its capabilities at an art exhibition in Berlin in late June, all in an effort to spark public debate.

However, the next people to develop a similar technology may use it in a far different way, such as blackmailing people by threatening to release fake nude photos that others won’t necessarily know are fake.

If it feels like AI is expanding wildly in dozens of different directions all at once, you aren’t all that far off. Some of those directions will be helpful for society, but others are downright terrifying. As deepfakes continue to look more and more realistic, the line between a fake digital world and reality will become increasingly difficult to discern. Guarding our privacy will almost certainly be more and more difficult as AI strips away our safeguards … and, potentially, even our clothing. 

Are you concerned about AI-created deepfake photos and videos affecting you personally? What safeguards should exist around the use of AI? Let us know by writing us at Cyberguy.com/Contact
