Security

A deceitful ‘Doctor’ in the Mac App Store

Patrick Wardle, Objective-See:

You probably trust applications in the Official Mac App Store. And why wouldn’t you?

Yup.

However, it’s questionable whether these statements actually hold true, as one of the top grossing applications in the Mac App Store surreptitiously exfiltrates highly sensitive user information to a (Chinese?) developer. Though Apple was contacted a month ago, and promised to investigate, the application remains available in Mac App Store even today.

Read the post for all the details (good work from Patrick Wardle and Twitter user @privacyis1st) but here’s a good summary from John Gruber, in a Daring Fireball post called The Curious Case of Adware Doctor and the Mac App Store:

What a bizarre story this is. Adware Doctor was a $4.99 app in the Mac App Store from a developer supposedly named Yongming Zhang. The app purported to protect your browser from adware by removing browser extensions, cookies, and caches. It was a surprisingly popular app, ranking first in the Utilities category and fourth overall among paid apps, alongside stalwarts like Logic Pro X and Final Cut Pro X.

Turns out, among other things, Adware Doctor was collecting your web browser history from Chrome, Firefox, and Safari, and uploading them to a server in China. Whatever the intention of this was, it’s a privacy debacle, obviously. This behavior was first discovered by someone who goes by the Twitter handle Privacy 1st, and reported to Apple on August 12. Early today, security researcher Patrick Wardle published a detailed technical analysis of the app. Wired, TechCrunch, and other publications jumped on the story, and by 9 am PT, Apple had pulled the app from the App Store.

So the issue was reported on August 12th, but the app wasn’t taken down until 26 days later, on September 7th.

But wait, there’s more.

Guilherme Rambo, in a 9to5Mac post titled Additional Mac App Store apps caught stealing and uploading browser history:

When you give an app access to your home directory on macOS, even if it’s an app from the Mac App Store, you should think twice about doing it. It looks like we’re seeing a trend of Mac App Store apps that convince users to give them access to their home directory with some promise such as virus scanning or cleaning up caches, when the true reason behind it is to gather user data – especially browsing history – and upload it to their analytics servers.

Today, we’re talking specifically about the apps distributed by a developer who claims to be “Trend Micro, Inc.”, which include Dr. Unarchiver, Dr. Cleaner and others.

These apps have been removed from the Mac App Store.

This raises some serious issues. Is this the tip of the iceberg? Are there other apps in the Mac App Store that do the same thing but haven’t yet been discovered? Is this just one technique of many? And what about the iOS App Store?

I am very reluctant to run any app on my Mac unless I either know and trust the developer or the app comes from the Mac App Store. The Mac App Store is a trusted source. If that trust is broken, either on the Mac or iOS, that’s a real problem for Apple.

I’m hoping we see some formal response from Apple, with some sense that they are aware of the issues involved and have new steps in place to root out existing apps that use this “give us access to your Home directory” (or similar) approach, steps that will prevent this issue from recurring.

Malware takes advantage of specific Safari setting

Patrick Wardle, Objective-See (via Michael Tsai):

Once the target visits our malicious website, we trigger the download of an archive (.zip) file that contains our malicious application. If the Mac user is using Safari, the archive will be automatically unzipped, as Apple thinks it’s wise to automatically open “safe” files.

This is a pretty long read, but it all comes down to the way macOS Safari treats downloaded files, and one specific setting in Safari Preferences:

Preferences > General > Open “safe” files after downloading

Here’s a picture of that setting, a checkbox down at the bottom of the General tab. I’ve unchecked mine. You might want to take a look at yours.

Key to all this is the word archives at the end. That includes .zip files, which can contain, well, bad stuff.

Read the linked article. As I said, I’ve unchecked my setting and haven’t yet encountered a problem with it set that way. Is this as bad as it seems?

UPDATE: This issue has, apparently, been around since the dawn of time, but the default is supposed to be unchecked. I just unboxed a new Mac, factory settings, no migration, and the setting was on/checked, on the public version of High Sierra.
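
If you’d rather check this from the command line than dig through Preferences, here’s a small Node/TypeScript sketch that shells out to macOS’s defaults tool. Treat the AutoOpenSafeDownloads key as an assumption on my part; it’s widely reported to be the preference behind that checkbox, but I haven’t confirmed it on every macOS version.

```typescript
import { execFileSync } from "node:child_process";

// Read Safari's "Open 'safe' files after downloading" setting via the
// macOS `defaults` tool. AutoOpenSafeDownloads is assumed to be the
// backing preference key for that checkbox.
function safariAutoOpensDownloads(): boolean {
  try {
    const out = execFileSync(
      "defaults",
      ["read", "com.apple.Safari", "AutoOpenSafeDownloads"],
      { encoding: "utf8" },
    ).trim();
    return out === "1"; // 1 = checked (auto-open on), 0 = unchecked
  } catch {
    // Key not set (or unreadable): Safari falls back to its built-in
    // default, which in my experience has been "on".
    return true;
  }
}

console.log("Auto-open 'safe' files:", safariAutoOpensDownloads() ? "ON" : "off");
```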

Ingenious BMW theft attempt

Marc Rooding, Medium:

During that night, my girlfriend and I were fast asleep, when at 03:45 the doorbell rang. We looked at each other dazed. I got out of bed and attempted to journey downstairs in my boxers when the doorbell rang again. Before opening the door I went into the living room to gaze out of the window. A police car with 2 policemen was standing in front of our house. I opened the door and was welcomed with the question whether I owned a BMW with a specific license plate. They said that a car burglary had taken place.

Read the story. Short version: the thieves tried a new approach that might signal a new wave of auto theft techniques. If nothing else, this will give you something to be aware of if your car is ever broken into but nothing appears to be taken.

Apple reassures customers after Australian media reports hack by teen

If you haven’t heard about this story, here’s yesterday’s Loop post. Shocking stuff.

Apple’s reassuring response:

An Apple spokesman said the company’s information security personnel “discovered the unauthorized access, contained it, and reported the incident to law enforcement” without commenting further on the specifics of the case.

“We … want to assure our customers that at no point during this incident was their personal data compromised,” the spokesman said.

That last is so good to know.

Make your Venmo transactions private. Seriously.

FastCompany:

When you think of companies that violate your privacy online, chances are Facebook is one of the first names that come to mind. But there’s another common app that should: Venmo, the PayPal-owned peer-to-peer payment app that lets people send money to friends, family, and anyone else you need to pay (including, for instance, drug dealers). The payments you make on the app, complete with a cute little emoji or note, are public by default, which means that many users don’t realize just how easy it is for the rest of the world to observe the $35 billion in transactions made on Venmo.

When I first read this, I was shocked. This is such a basic breach of user etiquette, so egregious, I struggled to believe it was true.

But I popped open my Venmo app, jumped over to Settings > Privacy and, sure enough, my Default Privacy Setting was set to Public (Visible to everyone on the Internet).

Why, Venmo? Why would you ever think that the transfer of money would be something I’d want to share with the world? What possible use case is that?

And even if there is a case for public visibility, why make it the default?

The mind reels.

How Smart TVs track pixels, communicate what you are watching to other devices, all to serve up ads

New York Times:

In recent years, data companies have harnessed new technology to immediately identify what people are watching on internet-connected TVs, then using that information to send targeted advertisements to other devices in their homes.

And:

Once enabled, Samba TV can track nearly everything that appears on the TV on a second-by-second basis, essentially reading pixels to identify network shows and ads, as well as programs on Netflix and HBO and even video games played on the TV. Samba TV has even offered advertisers the ability to base their targeting on whether people watch conservative or liberal media outlets and which party’s presidential debate they watched.

You might think this is nothing new, but this isn’t simply matching the current time and channel against a program guide to figure out what show is playing. This is actually analyzing the pixels on the screen to suss out the nature of the content. They can tell what video game you are playing, or watch you watching home videos, harvesting data and drawing conclusions all the while.

Have we learned nothing?

[H/T Robert Walter]

What is browser fingerprinting and how does it work?

Electronic Frontier Foundation:

When a site you visit uses browser fingerprinting, it can learn enough information about your browser to uniquely distinguish you from all the other visitors to that site. Browser fingerprinting can be used to track users just as cookies do, but using much more subtle and hard-to-control techniques.

And:

By using browser fingerprinting to piece together information about your browser and your actions online, trackers can covertly identify users over time, track them across websites, and build an advertising profile of them. The information that browser fingerprinting reveals typically includes a mixture of HTTP headers (which are delivered as a normal part of every web request) and properties that can be learned about the browser using JavaScript code: your time zone, system fonts, screen resolution, which plugins you have installed, and what platform your browser is running on.

And:

When stitched together, these individual properties tell a unique story about your browser and the details of your browsing interactions. For instance, yours is likely the only browser on central European time with cookies enabled that has exactly your set of system fonts, screen resolution, plugins, and graphics card.

The linked/quoted article is long and detailed, an enlightening read. But the bits about browser fingerprinting are incredibly important. And this is as good an explanation as I’ve seen.
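
To make this concrete, here’s a minimal sketch of the kind of signals a fingerprinting script can gather, written as TypeScript against standard browser APIs. The hash at the end just shows how the signals collapse into a single identifier; real trackers gather far more (canvas rendering, audio stack, full font lists, and so on).

```typescript
// Minimal browser-fingerprinting sketch: read a handful of properties
// any page's JavaScript can see, then collapse them into one identifier.
async function fingerprint(): Promise<string> {
  const signals = [
    navigator.userAgent,                    // browser and OS
    navigator.language,                     // preferred language
    `${screen.width}x${screen.height}`,     // screen resolution
    String(screen.colorDepth),
    String(new Date().getTimezoneOffset()), // time zone
    String(navigator.hardwareConcurrency),  // CPU core count
    navigator.cookieEnabled ? "cookies" : "no-cookies",
  ].join("|");

  // Hash the combined string into a compact, stable identifier.
  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

fingerprint().then((id) => console.log("fingerprint:", id));
```

None of those properties identifies you on its own; it’s the combination that does.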

At WWDC, Apple declared war on browser fingerprinting and related techniques. From Apple’s Mojave press release:

As with all Apple software updates, enhanced privacy and security remain a top priority in macOS Mojave. In Safari, enhanced Intelligent Tracking Prevention helps block social media “Like” or “Share” buttons and comment widgets from tracking users without permission. Safari now also presents simplified system information when users browse the web, preventing them from being tracked based on their system configuration.

And that’s a good thing.

Apple, Grayshift whac-a-mole

From this New York Times article:

Apple said it was planning an iPhone software update that would effectively disable the phone’s charging and data port — the opening where users plug in headphones, power cables and adapters — an hour after the phone is locked. While a phone can still be charged, a person would first need to enter the phone’s password to transfer data to or from the device using the port.

And from the ElcomSoft blog:

In the second beta of 11.4.1 released just days ago, activating the SOS mode enables USB restrictions, too. This feature was not present in the first 11.4.1 beta (and it is not part of any other version of iOS including iOS 12 beta). In all other versions of iOS, the SOS mode just disables Touch/Face ID. The SOS feature in iOS 11.4.1 beta 2 makes your iPhone behave exactly like if you did not unlock it for more than an hour, effectively blocking all USB communications until you unlock the device (with a passcode, as Touch ID/Face ID would be also disabled).

And this from Motherboard, with the title Cops Are Confident iPhone Hackers Have Found a Workaround to Apple’s New Security Feature:

“Grayshift has gone to great lengths to future proof their technology and stated that they have already defeated this security feature in the beta build. Additionally, the GrayKey has built in future capabilities that will begin to be leveraged as time goes on,” a June email from a forensic expert who planned to meet with Grayshift, and seen by Motherboard, reads, although it is unclear from the email itself how much of this may be marketing bluff.

And:

A second person, responding to the first email, said that Grayshift addressed USB Restricted Mode in a webinar several weeks ago.

My instinct is that this is, indeed, a marketing bluff. And a bluff has no teeth if the workaround doesn’t actually work.

Whac-a-mole (note the spelling, a trademark thing, I think).

The most sophisticated piece of software/code ever written

This story was at the top of Hacker News this morning. It’s a fascinating read, even if you know nothing about programming. And it’s a riveting true story. I’m convinced this would make a fantastic movie.

I didn’t quote any of it because it’d be hard to do so without including spoilers. But read it to the end. Fantastic.

Android has it, iOS needs it: Copy two-factor codes from text message

Jacob Kastrenakes, The Verge:

If you use two-factor authentication to secure your accounts, you’re probably used to this process: type in your password, wait for a text messaged code to arrive, memorize the code, and then type it back into the login prompt. It’s a bit of a pain.

Absolutely. Happens a lot. And this describes the process pretty well. Android has a fix:

In the new update, Messages will detect if you’re receiving a two-factor authentication code. When it does, it’ll add an option to the notification to copy the code, saving a step.

This is a step in the right direction. When a two-factor text is received, a copy button appears at the same time. Tap it, then paste it into the prompt.

It’d be nice to see this in iOS. But even better, it’d be nice to avoid the codes in the first place. The purpose of the codes is to prove that you have access to a verifying device. The codes themselves exist purely to give you a way to “move” the verification from the second device back to the original.

But iOS already does such an excellent job communicating between devices. I can copy on my iPhone, paste on my Mac, for example. And if the code is coming in on the same device that made the request, well that’s even easier.

What I’m suggesting is that Apple/Google work to create a verification service that eliminates all the friction. If I request a code on my Mac, pop up a verification message on my iPhone and, worst case, just make me tap “Yep” on an alert to verify the code, or “Nope” to let them know I didn’t make the request.

No reason for me to copy/paste or type in a number. Tap “Yep” and I’m in. Let the verification handshake happen in the background. Any reason this can’t be done?
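
For what it’s worth, here’s a purely hypothetical sketch of that handshake, with every name invented for illustration. The service parks the login request, pushes an alert to the trusted phone, and resolves the request when the user taps.

```typescript
// Hypothetical tap-to-approve verification flow. All names invented.
type Verdict = "yep" | "nope";

const pending = new Map<string, (verdict: Verdict) => void>();

// 1. The Mac requests a login. Instead of texting a code, the service
//    parks the request and pushes "Yep / Nope?" to the trusted iPhone.
function requestLogin(push: (loginId: string) => void): Promise<Verdict> {
  const id = crypto.randomUUID();
  return new Promise<Verdict>((resolve) => {
    pending.set(id, resolve);
    push(id); // deliver the alert to the phone
  });
}

// 2. The phone reports the user's tap; the waiting login unblocks.
function answerLogin(loginId: string, verdict: Verdict): void {
  const resolve = pending.get(loginId);
  if (resolve) {
    pending.delete(loginId);
    resolve(verdict); // "yep" lets the Mac in, "nope" rejects it
  }
}
```

No code ever reaches the user; the verification is just a promise resolved by a tap.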

iOS 11.4 adds USB Restricted Mode, port becomes power-only after 7 days without login

ElcomSoft blog:

In the iOS 11.4 Beta, Apple introduced a new feature called USB Restricted Mode. In fact, the feature made its first appearance in the iOS 11.3 Beta, but was later removed from the final release. This is how it works:

“To improve security, for a locked iOS device to communicate with USB accessories you must connect an accessory via lightning connector to the device while unlocked – or enter your device passcode while connected – at least once a week.”

And:

In other words, law enforcement will have at most 7 days from the time the device was last unlocked to perform the extraction using any known forensic techniques, be it logical acquisition or passcode recovery via GreyKey or other services.

It will be interesting to see if this mode survives through to the actual public release of 11.4. A chess move. Will the GrayKey folks have a follow-up? Or will all those $30K GrayKey devices become useless against updated phones?

Twitter urges users to change their password after bug stored passwords “unmasked”

Twitter blog:

When you set a password for your Twitter account, we use technology that masks it so no one at the company can see it. We recently identified a bug that stored passwords unmasked in an internal log. We have fixed the bug, and our investigation shows no indication of breach or misuse by anyone.

And:

We mask passwords through a process called hashing using a function known as bcrypt, which replaces the actual password with a random set of numbers and letters that are stored in Twitter’s system. This allows our systems to validate your account credentials without revealing your password. This is an industry standard.

Due to a bug, passwords were written to an internal log before completing the hashing process. We found this error ourselves, removed the passwords, and are implementing plans to prevent this bug from happening again.

This seems like a pretty major slip-up. The way I’m reading this, somewhere internal to Twitter, your password was stored “unmasked”. And to me, that means in the clear, in plain text. Am I misreading this?

No matter. Go to Twitter Settings and change your password.
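
If the mechanics are fuzzy, here’s a minimal sketch of what Twitter describes, using the Node bcrypt package. The commented-out log line is the shape of the bug: writing the password somewhere before it gets hashed.

```typescript
import * as bcrypt from "bcrypt";

// What Twitter describes: run the password through bcrypt and store
// only the hash, so systems can verify a login without ever keeping
// the password itself.
async function storePassword(password: string): Promise<string> {
  // The bug, roughly: logging the raw password *before* hashing it
  // leaves it sitting in an internal log in plain text.
  // log.info(`new password: ${password}`); // <-- never do this

  return bcrypt.hash(password, 10); // 10 salt rounds
}

async function checkPassword(password: string, storedHash: string): Promise<boolean> {
  // bcrypt re-hashes the attempt with the stored salt and compares.
  return bcrypt.compare(password, storedHash);
}
```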

The best replacements for Apple’s defunct AirPort accessories

Chance Miller, 9to5Mac, lays out his take on the best router and mesh alternatives, now that Apple has end-of-life’d their AirPort line.

Before you dig into Chance’s list, take a moment to read through the list of “features to look for” in Apple’s Choosing a Wi-Fi router to use with Apple devices support note.

My biggest concern about a third party solution is trust. If I buy an Apple branded product, I trust that there’s no malware embedded in the firmware/software. I trust that if a vulnerability is found, it will be patched quickly and that patch will make its way onto my device pretty quickly. I trust that if I do run into a problem with that device, I can turn to Apple, an Apple Store, or to the thriving and friendly on-line community to help solve it.

Apple selling a product does not carry the same trust as an Apple-branded product. One case in point: the LG UltraFine 5K Display and its Wi-Fi interference problem. The problem was eventually fixed, but Apple sold that product as it exited the display business.

My home network is the weakest point in my on-line security, and the router is the focal point for attempts to break in. Choosing a router I can trust is a critical decision. I hate that Apple has left this market. And no matter how well recommended a router may be, I just won’t trust that my interests will come first with that company.

Startup offers $3 million to anyone who can hack the iPhone

Motherboard:

The startup is called Crowdfense and is based in the United Arab Emirates. In an unusual move in the normally secretive industry of so-called zero-days, Crowdfense sent out a press release to reporters on Tuesday, advertising what it calls a bug bounty.

And:

Crowdfense’s director Andrea Zapparoli Manzoni told me that he and his company are trying to join that market, purchasing zero-days from independent researchers and then selling them to law enforcement and intelligence agencies.

And:

“When I think about government agencies I don’t think about the military part, I think about the civilian part, that works against crime, terrorism, and stuff like that,” Zapparoli told me in a phone interview. “We only focus on tools aimed at doing activities of law enforcement or intelligence, not aimed at destroying or deteriorating the functionality and effectiveness of the target systems—but only aimed at collecting intelligence.”

And:

The company has a budget of $10 million for this “bug bounty.” Its backers, for now, are also secret.

The mind reels. Unless I misread this piece, no part of their plan is to share any discovered vulnerabilities with Apple. This is straight “help us break the system,” not “make it better.”

“Vetting customers is the most delicate part of our whole activity,” Zapparoli said.

I’m going to go out on a limb and guess that your customer list will remain a secret as well. This whole thing is chilling to me.

Florida police use dead man’s finger to try to unlock iPhone

Titillating headline, but read on:

Authorities in Florida showed up to a funeral home and tried to unlock a dead man’s cell phone using his finger.

And:

Largo Police Lt. Randall Chaney told the Tampa Bay Times that the detectives were trying to gain access to and protect data relevant to their investigation into Phillip’s death, as well as another investigation Phillip was involved in related to drugs.

And:

There is no expectation of privacy after a person passes away, so the move to access the iPhone by detectives was legal, but not necessarily appropriate or ethical, Charles Rose, a professor at Stetson University College of Law, told the Tampa Bay Times.

“While the deceased person doesn’t have a vested interest in the remains of their body, the family sure does, so it really doesn’t pass the smell test,” he told the newspaper. Even though a deceased person can no longer claim their property for themselves under their Fourth Amendment rights, whoever inherits the property at stake, such as family, can exercise those rights, he said.

I’ve long wondered about the legality of physically forcing someone to unlock their iPhone using their finger or their face. Does that legal status change when someone dies?

And what about Face ID? Will it work on a dead person whose eyes are open? Can attention detection tell if someone is dead?

UPDATE: Couple of good comments from JLMoran. Sounds like neither Touch ID nor Face ID will work on a dead person, at least not without some extra trickery.

Someone is trying to extort iPhone crackers GrayShift with leaked code

Motherboard:

Law enforcement agencies across the country are buying or have expressed interest in buying GrayKey, a device that can unlock up-to-date iPhones. But Grayshift, the company that makes the device, has attracted some other attention as well.

Last week, an unknown party quietly leaked portions of GrayKey code onto the internet, and demanded over $15,000 from Grayshift—ironically, the price of an entry-level GrayKey—in order to stop publishing the material. The code itself does not appear to be particularly sensitive, but Grayshift confirmed to Motherboard the brief data leak that led to the extortion attempt.

The mind reels. If some organization comes up with a golden key that unlocks all iPhones, that golden key will find its way into nefarious hands. This is living proof of that.

What to do if your iPhone is stolen, what you can do now to make that less painful

Nice writeup by Andrew Orr for The Mac Observer. This is one of those posts that’s worth scanning now, while you have your feet up with a cup of coffee, rather than in a state of panicked response to your phone going missing.

One note: Ignore the link to “How to Set Your iOS Device Data to Auto-Destruct” on that page. As pointed out in the comments, it’s outdated and no longer accurate.

UPDATE: Outdated link was deleted from the Mac Observer article.

James Comey’s new book, privacy, and Apple

9to5Mac’s Ben Lovejoy just finished reading James Comey’s new book, A Higher Loyalty. Politics aside, a section of the book deals with the FBI’s battle with Apple to access an iPhone used by a San Bernardino gunman, detailed on this Wikipedia page.

Ben briefly excerpts Comey’s book, interleaving his own take with relevant passages. Short and worth the read.

Motherboard: Cops around the country can now unlock iPhones, records show

Motherboard:

Police forces and federal agencies around the country have bought relatively cheap tools to unlock up-to-date iPhones and bypass their encryption, according to a Motherboard investigation based on several caches of internal agency documents, online records, and conversations with law enforcement officials.

And:

Regional police forces, such as the Maryland State Police and Indiana State Police, are procuring a technology called ‘GrayKey’ which can break into iPhones, including the iPhone X running the latest operating system iOS 11.

Is this whac-a-mole? Will Apple be able to change iOS to break GrayKey? And, if so, how long will it take for GrayKey, or another technology, to ship a replacement?

New web standard would allow Touch ID and Face ID to be used to login to websites

Ben Lovejoy, 9to5Mac:

A new web standard being recommended for adoption would open the way for both Face ID and Touch ID to be used to login to websites.

The API, known as WebAuthn, allows existing security devices – like fingerprint readers, cameras and USB keys – to be used for website authentication.

And:

There’s as yet no word on Safari, but with all current and recent iPhones and iPads offering either Face ID or Touch ID, and the latter supported on the MacBook Pro too, this would be tailor-made for Apple. It cannot be used with other browsers without Apple’s support.

Very interesting.
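
To give a flavor of the API, here’s a trimmed sketch of WebAuthn registration as a page’s script would perform it, using the standard navigator.credentials call. In a real flow the challenge and user ID come from the server; they’re generated locally here just so the sketch stands alone.

```typescript
// Trimmed WebAuthn registration sketch. The browser prompts for the
// authenticator (Touch ID, Face ID, a USB key); the private key never
// leaves the device, and the site only ever sees a public key.
async function registerWithWebAuthn(): Promise<Credential | null> {
  const options: CredentialCreationOptions = {
    publicKey: {
      // Normally a server-generated random challenge.
      challenge: crypto.getRandomValues(new Uint8Array(32)),
      rp: { name: "Example Site", id: "example.com" },
      user: {
        id: crypto.getRandomValues(new Uint8Array(16)),
        name: "user@example.com",
        displayName: "Example User",
      },
      // Accept ES256 credentials, the common platform-authenticator type.
      pubKeyCredParams: [{ type: "public-key", alg: -7 }],
      authenticatorSelection: {
        authenticatorAttachment: "platform", // built-in, e.g. Touch ID
        userVerification: "required",
      },
      timeout: 60000,
    },
  };
  return navigator.credentials.create(options);
}
```

The site never sees a password or a biometric, only a signed challenge, which is exactly why Face ID and Touch ID are such a natural fit.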

Finding out some of the info that Google keeps on you

This time, we’re talking Google, not Facebook. These links come from this Twitter thread posted by Dylan Curran. Good stuff.

Jump into your browser and step through the links in the thread. There’s a lot more there. Or download your Google data using this link.

Facebook’s official response on accusations of logging call and SMS history without permission

Facebook:

You may have seen some recent reports that Facebook has been logging people’s call and SMS (text) history without their permission.

This is not the case.

And:

Call and text history logging is part of an opt-in feature for people using Messenger or Facebook Lite on Android. This helps you find and stay connected with the people you care about, and provides you with a better experience across Facebook. People have to expressly agree to use this feature. If, at any time, they no longer wish to use this feature they can turn it off in settings, or here for Facebook Lite users, and all previously shared call and text history shared via that app is deleted. While we receive certain permissions from Android, uploading this information has always been opt-in only.

There is a lot to process here. I can tell you that I never intended for Facebook to keep all this data on me. Clearly, at some point, I must have opted in, and I can accept that. But seeing the level of detail Facebook has kept makes that opt-in feel disingenuous at best.

When I opt in to allow an app to access my contacts, for example, I’m thinking “use as needed”, not “scrape as much data as possible and squirrel it away”. Worlds of difference between those two.

And what the hell is Facial Recognition Data and why is Facebook saving it? If you haven’t seen this yet, check this post.

Wonder what Google has on you? We’ll get to that next.

To find suspects, police quietly turn to Google, seek devices near crime scenes

Tyler Dukes, WRAL, Raleigh, North Carolina, reporting on two unrelated murders:

In March 2017, months after investigations began into both shootings, separate detectives on each case, one day apart, employed an innovative strategy in criminal investigations.

On a satellite image, they drew shapes around the crime scenes, marking the coordinates on the map. Then they convinced a Wake County judge they had enough probable cause to order Google to hand over account identifiers on every single cell phone that crossed the digital cordon during certain times.

And on reactions from defense attorneys and privacy advocates:

They’re mixed on how law enforcement turns to Google’s massive cache of user data, especially without a clear target in mind. And they’re concerned about the potential to snag innocent users, many of whom might not know just how closely the company tracks their every move.

To get a sense of just how much location tracking Google does, check out this Quartz post from last November:

Many people realize that smartphones track their locations. But what if you actively turn off location services, haven’t used any apps, and haven’t even inserted a carrier SIM card?

Even if you take all of those precautions, phones running Android software gather data about your location and send it back to Google when they’re connected to the internet, a Quartz investigation has revealed.

According to this story, and others I’ve read, Google can track your location, even if you take out your SIM card. Amazing.

Read both of these stories. They are riveting and chilling.

Apple adds new Families page, gathering all parental tools in one place

Ina Fried, Axios:

A new page on Apple’s website details its efforts to make Macs and iPhones family friendly, including parental controls and other safety features. The move comes as Apple and other tech giants are under fire over whether their products are addictive, especially for children.

From this letter to Apple from a collective of Apple investors:

We have reviewed the evidence and we believe there is a clear need for Apple to offer parents more choices and tools to help them ensure that young consumers are using your products in an optimal manner. By doing so, we believe Apple would once again be playing a pioneering role, this time by setting an example about the obligations of technology companies to their youngest customers.

Apple’s new page is here. Definitely a step in the right direction, a single stop for learning about tools and resources for keeping your family safe.

Google and HTTP

Dave Winer:

I’ve been writing about Google’s efforts to deprecate HTTP, the protocol of the web. This is a summary of why I am opposed to this.

DaveW’s take on Google’s pitch:

  1. Something bad could happen to my pages in transit from a HTTP server to the user’s web browser.
  2. It’s not hard to convert to HTTPS and it doesn’t cost a lot.
  3. Google is going to warn people about my site being “not secure.” So if I don’t want people to be scared away, I should bend to their will (as if the web were their platform).

The rest of the article is Dave’s rebuttal, a thoughtful read from a very smart someone who knows this stuff inside and out. A few bits:

Google is a guest on the web, as we all are. Guests don’t make the rules.

And:

A lot of the web consists of archives. Files put in places that no one maintains. They just work. There’s no one there to do the work that Google wants all sites to do.

And:

Google has spent a lot of effort to convince you that HTTP is not good. Let me have the floor for a moment to tell you why HTTP is the best thing ever.

If you care about HTTP vs HTTPS, take a few minutes to read Dave Winer’s post. Then dig into Nick Heer’s excellent In Defence of Surfing the Insecure Web for a bit wider perspective.

Neither take is about bashing HTTPS, or about ditching security in any way. It’s about thinking carefully before ditching openness and about how decisions about the internet are and should be made.

Good stuff.