John Gruber’s right: Google is probably lying when it says it was surprised by Apple’s decision to build a new Maps app without Google’s data. But the lie is plausible enough that Google can let Apple stew for a while (six months?) before deciding whether to release its own iOS maps app, all without getting blamed for being dicks. Apple’s iOS Maps have been inferior to Android’s for years, because Google held the best stuff for itself, and the folks at Google are smart enough to know that Apple would have a hard time building a suitable mapping replacement in time for iOS 6.
So, by withholding a maps app from iOS, Google widens the gap between iOS and Android maps (advantage: Google).
If, on the other hand, Google had released its own maps app for iOS using its own data, as it did for YouTube when Apple dropped it from iOS 6, the story wouldn’t be “Apple’s Maps sucks.” It would be “Apple’s Maps sucks, but Google Maps is fine, just install that.”
The latter doesn’t become the subject of a column in The New York Times from David Pogue. The former does.
Brian Lam does technology writing differently at The Wirecutter. Instead of writing about every single doodad and gizmo, he writes “a list of great technology,” aiming only to tell readers what the best thing in a category is. Of course he wrote about the new iPhone:
These things are always the same. But better in small but meaningful ways. That’s all I remember from today’s news, really.
It’s also pretty much the same thing Apple says on their website and on the website of every other publication that writes about this stuff. It’s also pretty much what I wrote for the 4s and the 4 and the 3gs and the 3g, too. I feel despair when I am forced to write words that provide no service or additional value, but there’s a balancing act between saying what I think is useful and saying what people want to hear, so here we are.
Should you get one? If you want, sure.
His post is short, to the point, and not breathless, all of which is, sadly, refreshing in the world of online-gadget writing.
We all should have known Twitter was headed in the wrong direction when it started using “grow” and “evolve” as transitive verbs.
For a long time, because these are the kinds of things I think and worry about, I’ve wondered what would be enshrined at a hall of fame for sandwiches. And, on a long drive back from vacation following the consumption of several cheese steaks, I had some time to nail it down.
First, the criteria for candidacy:
The inductee must be a sandwich. It must involve bread with a filling. This seems obvious, but with the growth of paleo, gluten-free diets, sandwich shops selling wraps, and other trends, it’s important to be explicit. Sandwiches that use rolls or other forms of bread instead of sliced bread and open-faced sandwiches that use a single slice of bread will be considered for inclusion. “Flat bread,” the term some restaurants seem to be adopting because they don’t want to say they serve pizza, is not a sandwich.
The inductee must be an all-time great. No second-tier or fad sandwiches will be inducted. Looking down the list of inductees, you should see a who’s who of the world’s sandwiches.
The inductee should have cultural significance. While we consider the sandwich’s taste, we’re not interested in amazing-tasting sandwiches that no one has ever heard of or eaten. Regional specialties are eligible (and, often, strong candidates).
The inductee must be a canonical version of the sandwich, though some variations are acceptable. While many sandwiches are so good that they’ve spawned their own variations, the considered sandwich must be a canonical version, not an entire class of sandwiches. This is, perhaps, the hardest part to lay out, but it generally means that the composition of the sandwich should be understood from its name. So while hoagies or po’ boys might be mighty fine sandwiches, they are, in the end, platforms that require some explanation; you can’t simply walk into a sandwich shop, order “a hoagie” and know exactly what you’ll get. A cheeseburger, while there exist infinite variations, is understood to be a bun, a beef patty and melted cheese. This is similar to how the martini might be inducted into a cocktail hall of fame. Leaving aside the gin-versus-vodka debate, the drink would be inducted as a whole, and a martini garnished with a twist would not be inducted separately from a martini garnished with an olive. The hall is not interested in defining the One True Version of a sandwich, but, rather, the acceptable parameters for a sandwich with a specific name.
And now, the inaugural class of the International Sandwich Hall of Fame:
Reuben: rye bread, corned beef, sauerkraut, Swiss cheese, thousand island or Russian dressing
Cheese steak: Italian roll; thinly sliced beef; white American, provolone or Cheez Whiz; fried onions are optional, though encouraged
BLT: bacon, lettuce, tomato and mayonnaise
Peanut Butter and Jelly: smooth or crunchy peanut butter, strawberry or grape jelly, white or wheat sandwich bread
Cheeseburger: bun; beef patty; Cheddar or American cheese; ketchup, mustard, mayonnaise, lettuce, tomato, onion and pickle are optional
What would you add?
The idea of creating a “character” from an Apple employee is… well… damn, I can’t even say this without feeling awful… it feels like something Best Buy would do. Maybe even Dell.
I think this is the problem. When I first saw them, I thought they could easily be Microsoft ads.
Even if the ads appeal to “people who’ve never bought a Mac but are thinking about buying their first,” which John Gruber says should be the test, there are ways to appeal to that segment and to current users that don’t stoop to the normally low comedic standards of the advertising industry.
I’m not a Mac owner, though if I bought a new computer today, it would most likely be a Mac. For what it’s worth, I think the ads are dumb, but they wouldn’t make a difference to me one way or the other. I asked my wife, also not a Mac owner and less likely than I to become one, what she thought when “Mayday” came on during a break in the Olympics last night. Her response: “I thought it was dumb that a guy felt he could make up for forgetting an anniversary by making a video that didn’t take any work.”
Yesterday, Twitter cut the API access that had allowed Instagram users to easily find the people they followed on Twitter on the photography network. Dan Frommer suggests it was payback against Instagram’s soon-to-be owner, Facebook, for blocking Twitter’s ability to let its users look up their Facebook friends on Twitter.
Users’ social graphs are valuable to social networks. Allowing users to find their friends on other services benefits the users, but it also benefits social networks by making them somewhat of a canonical list of your Internet friends. But when one network grows large enough to challenge another network, the interest changes to preventing users from moving easily to the new service.
Now here’s where I make a wild speculation for which there might not be evidence: do these moves actually cede power to Apple? Maybe Apple, which has shown its social-network ineptitude, doesn’t care. But Apple’s iOS 6 (and, I believe, a future OS X Mountain Lion update) integrates Facebook, Twitter and contacts, making it, perhaps, the way millions of iPad, iPhone, iPod Touch and Mac users move between and integrate the two services.
Mark Emmert, president of the NCAA, while announcing the penalties against Penn State and its football program:
Football will never again be placed ahead of educating, nurturing and protecting young people.
I’m glad there are penalties, but how does he say that with a straight face?
If you want to keep the software and services around that you enjoy, do what you can to make their businesses successful enough that it’s more attractive to keep running them than to be hired by a big tech company.
The companies he mentions both sold paid Mac and iOS apps (Pulp, Wallet and Sparrow). The problem users face in this space strikes me as similar to another point Arment made, about advertisers outbidding users for their own attention.
I don’t know what Sparrow expected, but they knew they were competing with free when they built a paid Gmail client. Users bought the software. Development was still killed. How can users compete with that?
Rian van der Merwe writes something similar, more eloquently than I do:
But… that’s what I did. I paid full price for every version of the Sparrow app I could find. I told everyone who would listen to buy it. I couldn’t have given them more money even if I wanted to. So, as a customer, what more could I have done to keep them running independently?
This is the core of the disappointment that many of us feel with the Sparrow acquisition. It’s not about the $15 or less we spent on the apps. It’s not about the team’s well-deserved payout. It’s about the loss of faith in a philosophy that we thought was a sustainable way to ensure a healthy future for independent software development, where most innovation happens.