Texas Man Used AirTag to Track and Kill Suspected Truck Thief

source link: https://www.vice.com/en/article/xgwzv3/texas-man-used-airtag-to-track-and-kill-suspected-truck-thief


The man tracked someone he suspected of stealing his truck to a shopping mall parking lot and shot him in the head, police say.
April 3, 2023, 3:53pm
An AirTag on a keychain. Getty Images

A San Antonio man allegedly used an Apple AirTag tracking device to locate his stolen truck and kill the suspected thief. 

According to police reports, he used the AirTag to find the truck in a shopping center parking lot and shot the man he suspected of stealing it in the head.

The shooter isn’t expected to face charges, according to local news outlet My San Antonio. The man reported his truck as missing to the San Antonio Police Department, but didn’t wait for police to respond before tracking and killing the suspect himself.


The suspected car thief, Andrew John Herrera, 44, died “from a gunshot wound to the head,” a medical examiner told news outlet KSAT. As of Saturday, news reports say, police hadn’t determined whether Herrera was armed. 

Apple introduced AirTags in 2021 as a way to keep track of belongings like keys or luggage. Since then, people have used the coin-sized location tracking devices to stalk and harass women, often by hiding them in cars. Cops and courts have frequently mishandled these cases. 

SAPD spokesperson Nick Soliz told My San Antonio that the man whose car was allegedly stolen told police he “believed the suspected thief pulled out a gun which prompted ‘a firefight.’” Police said that they believed only the man who confronted the suspected thief fired any shots. 

Texas law states that a person can use deadly force against another in certain cases of “unlawful interference” with their property. 

The San Antonio Police Department did not immediately respond to Motherboard’s request for comment. 



GirlsDoPorn Founder and FBI Most Wanted Fugitive Arrested in Spain

Michael James Pratt has been on the run, evading sex trafficking charges that may land him in prison for life.
December 23, 2022, 5:43pm
Pratt being led away by authorities.
Screenshot via Twitter

Michael James Pratt, the man responsible for organizing the elaborate sex trafficking scheme and website Girls Do Porn, which forced and coerced potentially hundreds of women into having sex on camera, was arrested in Spain on Friday.

Spain’s National Police Corps released a video of Pratt’s arrest on Twitter. 

The tweet doesn’t name Pratt, but says that the man in the video is a New Zealander on the FBI’s top 10 most wanted list; Pratt made it to the top 10 in September, with a $100,000 reward for information leading to his arrest. According to news outlet El Español, Pratt was hiding in Madrid, and checked into a hotel in the center of the city on Wednesday using one of his many false identities.  


The owners and operators of Girls Do Porn were charged with federal counts of sex trafficking by force, fraud, and coercion in 2019. Pratt and his associates filmed hundreds of sex scenes with young women between 2012 and 2019, luring them to their San Diego-based shoots under the guise of “modeling” gigs and fast cash. Many of the women testified that they weren’t told until they arrived on location that it was a porn shoot; some said they were trapped in the rooms and brutally abused, abuse that was edited out of the final videos.

Pratt and his co-conspirators also promised many of the women that their videos would only be seen by “private collectors” in faraway countries like New Zealand, and that their communities at home would never see their sex scenes. This was a lie; Girls Do Porn uploaded the videos almost immediately to the internet, to massive tube sites including Pornhub, which in many cases ruined the women’s lives.

"Personally, I'm happy they found him. But I'm even happier for all of his victims,” Brian Holm, attorney for the victims in the civil trial, told Motherboard. “Bringing Pratt to justice will bring the closure they need to heal. Christmas came a bit early this year."

The FBI declared Pratt a wanted fugitive in early 2020; he’d fled the US in 2019, during a civil trial in which 22 women accused him and his associates of fraud and coercion. In that trial, a state judge ruled that Pratt and his co-conspirators owed the women $12.7 million in damages. Many of his associates have since pled guilty to federal charges, including the main performer Ruben Andre Garcia, who was sentenced to 20 years in prison by a federal court in California in 2021; cameraman Theodore "Teddy" Gyi, who pled guilty to conspiracy to commit sex trafficking by force, fraud, and coercion; and Pratt’s closest co-conspirator, Matthew Isaac Wolfe, who also pled guilty to federal trafficking charges in 2022. 

“The defendant lied and tricked these women, made millions along the way, and left his co-conspirators to face justice while he fled,” Special Agent in Charge Suzanne Turner said in a 2021 press release asking for the public’s help in locating Pratt. “Michael James Pratt is a danger to society, regardless of where he is, and is likely still victimizing people while on the run with his continued lies and false promises.”


San Francisco Decides Killer Police Robots Are Not a Great Idea, Actually

“We should be working on ways to decrease the use of force by local law enforcement, not giving them new tools to kill people.”
December 7, 2022, 2:24pm
Police tape. Anadolu Agency / Contributor via Getty

In an abrupt reversal amid public outcry, San Francisco’s Board of Supervisors has temporarily reversed its decision to permit the city’s police department to kill people with robots, various news outlets reported.


“There have been more killings at the hands of police than any other year on record nationwide,” said District Supervisor Dean Preston in a statement. “We should be working on ways to decrease the use of force by local law enforcement, not giving them new tools to kill people.”

Last week, the Board voted 8-3 to approve a slate of policies regarding SFPD’s use of military-grade equipment, including using bomb-disposal robots to kill people, as the Dallas police did in 2016 with a cornered shooting suspect. Initially, the Board did not want to include language allowing the police to kill people with robots, but the SFPD amended the language to explicitly allow it.

It is not clear precisely why the Board changed its vote over the course of a week, but public outcry at the local, national, and international level seems to have played a major part. The Board’s vote was heavily criticized by news outlets around the world and by local privacy and civil rights groups that had already organized around another Board of Supervisors vote permitting the SFPD to access live video from private surveillance cameras. On Monday, the Electronic Frontier Foundation and 44 community groups signed a letter opposing the policy, arguing that “There is no basis to believe that robots toting explosives might be an exception to police overuse of deadly force. Using robots that are designed to disarm bombs to instead deliver them is a perfect example of this pattern of escalation, and of the militarization of the police force that concerns so many across the city.” The coalition also held a protest at City Hall on Monday.

However, the vote reversal is not permanent. According to the San Francisco Chronicle, the issue is being sent back to the Rules Committee, which will debate the topic further.

Matthew Guariglia, Policy Analyst for the Electronic Frontier Foundation, said in a statement the fight is not over. “Should the Rules Committee revisit the issue, the community must come together to stop this dangerous use of technology.”

Correction: This article previously stated Preston changed his vote. He voted against the proposal both times.

A silhouette in front of a bunch of Twitter logos. Getty Images

Members of Twitter’s Trust and Safety Council Not Sure Elon Musk Knows They Exist

As hate speech surges and the future of the platform seems to be in the balance, the people who helped advise Twitter on user safety say they don’t know what lies ahead.
November 2, 2022, 4:22pm

Members of Twitter’s Trust and Safety Council—a group of 100 organizations working on issues including harassment, content moderation, and suicide prevention on the platform—say that they’re unsure about their future, and if Elon Musk, who took over Twitter last week, even knows they exist.

“Now I feel like we're in a different universe,” Danielle Citron, vice president at the Cyber Civil Rights Initiative, told Motherboard. Citron said that despite one of the council’s regular meetings being on the calendar, her organization hasn’t heard from Twitter and Twitter staff seems to be “ghosting” them on updates.


Bloomberg reported on Monday that most people who work in Twitter’s Trust and Safety organization are locked out of their access to internal tools used for content moderation, and “are currently unable to alter or penalize accounts that break rules around misleading information, offensive posts, and hate speech,” citing anonymous sources familiar with the matter. Musk’s first act as new owner of Twitter was firing its top executives, including CEO Parag Agrawal, CFO Ned Segal, policy executive Vijaya Gadde, and company general counsel Sean Edgett. Gadde worked closely with the council, according to Citron.

Musk has said that he wants to form his own content moderation council, with “widely diverse viewpoints.” Musk tweeted last week that “no major content decisions or account reinstatements will happen before that council convenes.” 

On Wednesday, he tweeted that he’d talked to people at the Anti-Defamation League, Color of Change, and the NAACP, among others, about “how Twitter will continue to combat hate & harassment & enforce its election integrity policies.” 

After Musk’s takeover of Twitter, the platform saw a surge of hate speech, according to Twitter’s head of safety and integrity, Yoel Roth.


Where all of this leaves the existing Trust and Safety Council is unclear. Twitter did not respond to a request for comment about the status of the council. 

“I sadly am not sure Elon Musk knows about the existence of the [Trust and Safety] council as of yet,” Alex Holmes, deputy CEO at the Diana Award Anti-Bullying Campaign and a member of the council, told Motherboard. “The Twitter Trust and Safety Council is a dedicated and passionate global group made up of unpaid representatives from NGOs, safety, hate speech and free speech experts who are there to be critical friends. We have often given our advice on upcoming products/tools, updates, safety issues. We are not an oversight board, and not involved in any moderation decisions, instead supporting a safe and healthy platform which is inclusive of all.” 

Twitter formed the Trust and Safety Council in 2016 as “a new and foundational part of our strategy to ensure that people feel safe expressing themselves on Twitter,” according to its announcement—with more than 40 organizations and experts from 13 regions making up its inaugural members. The council held its first annual summit the following year at Twitter’s San Francisco headquarters, where then-CEO Jack Dorsey participated and heard presentations from members. There are currently 100 organizations representing five different focus areas—content governance, suicide prevention, child sexual exploitation, online safety and harassment, and digital and human rights—listed on the council’s website.


“I felt very plugged-in, like I could always go to Vijaya,” Citron said. “It felt really responsive.”

Emma Llansó, director for the Center for Democracy and Technology’s Free Expression Project, and a member of the council, told Motherboard that her organization hasn’t heard anything from Twitter since late September.

“From my experience, the Council members are all really dedicated to trying to help Twitter be more responsive to abuse and more transparent and fair in how they enforce their policies,” Llansó said. “There's still a long way to go, but Twitter staff have made a continual effort to improve the experiences of its most vulnerable users. It's hard to tell exactly what Musk's plans are for trust and safety work at Twitter, but it's disconcerting that he talks about taking the company in a different direction.” 

Before his takeover of the company, Musk frequently complained about what he saw as the platform’s lack of “free speech,” but he has defined his vision of free speech only as “that which matches the law,” in an April tweet. 

“I don't think he’s about free speech; I think he's about ‘free speech that I like,’” Citron said.

Twitter has always had major flaws in how it’s handled issues of privacy, safety, and user trust. It’s been widely criticized as reluctant to solve the issues of hate speech and trolls, while rolling out features no one asked for. The council itself accused Twitter of not listening or being responsive enough in 2019, in a letter to Dorsey obtained by Wired. But even with its existing issues, disbanding a group that’s been doing years of work in safety at a critical moment in the platform’s history would be a mistake, members of the council say. 

“It would be a shame to see the work and passion of this global group disbanded, and I am hopeful there is a way to continue to work with Twitter under new direction,” Holmes said.

“If Twitter dissolves the Council, I worry that could signal a retrenchment by Twitter, as far as seeking outside expertise, and a decision to deprioritize crucial trust and safety work,” Llansó said. “Twitter needs to have some process for engaging outside experts and perspectives in order to better inform its work.”
