On the surface, it may seem that macOS Mojave is an extremely minor update. Other than Dark Mode and the reliance on Metal graphics, it doesn’t seem a whole lot different when you look it over, as I did starting last month. But the mere fact of choosing Metal means that Macs without support for that graphics technology have been made obsolete.
Before Mojave was announced, I had planned (hoped) to test the betas on my 2010 17-inch MacBook Pro. Obviously that’s not possible, despite the fact that it has an SSD formatted with the APFS file system. That’s because its graphics hardware, state of the art eight years ago, preceded the arrival of Metal.
A 2012 MacBook Pro, the model in which the Retina display debuted on Macs, works just fine. So do older Mac Pros with graphics cards that support Metal. So my only option was the iMac. With a Fusion drive, it lost out on the APFS conversion last year, because Apple couldn’t make it compatible. APFS support for Fusion drives appeared in the early betas of High Sierra, but was soon pulled.
Apple software chief Craig Federighi promised that APFS support would return in a “future update.” Nothing more was said on the subject until May, weeks ahead of WWDC and the launch of High Sierra’s successor, Mojave. I doubt Federighi expected it would take until then, but even if he had known all along that it would take the full year, he wouldn’t have admitted it.
This time it was clear APFS was expected to work. So, with multiple backups, I was willing to take the chance. If something went wrong, I could just restore the computer.
My only concern at the time was the report from Rogue Amoeba, publisher of Audio Hijack, which we use to capture audio for the radio show, that it wasn’t compatible with Mojave. Apparently the ACE component, used for instant capture, doesn’t work as of this writing. So far, the publisher hasn’t even hinted at when that update will arrive, though it is expected to appear when Mojave is released. I asked their support people if I might make it work without ACE, and the answer wasn’t definite.
Based on experience with previous versions of macOS, where this component had to be updated, I suspected that the main issue would be that I couldn’t capture audio from an app that was already running; Audio Hijack would have to launch it first. If the app was already open, Audio Hijack would put up a prompt asking you to quit and relaunch it. Yes, an assumption, but I decided to go for it.
So on a Friday night, I backed up all my content via Carbon Copy Cloner to a second drive. I was ready.
I didn’t monitor the entire installation, except for an occasional glance. When I woke up the next morning, my iMac was running Mojave, and for the most part it didn’t look terribly different. Well, until I launched Disk Utility, and discovered that the drive was indeed using APFS. There was no warning and no option to block it. There it was, and it seemed OK.
I assume Apple has tested Fusion drives enough to know the conversion would succeed, and so far Mojave is mostly behaving. I do see slightly speedier performance, and I like the idea of being able to duplicate files to another portion of the drive almost instantaneously.
But what about Audio Hijack?
I launched it, selected my workflow and started a recording. As I suspected, Skype launched and everything went normally. If, however, I started a recording while Skype was running, Audio Hijack would put up a prompt to quit and relaunch. That’s no different from the way it worked before the ACE or instant capture component was developed.
I’m still waiting for an update from Rogue Amoeba — they aren’t sure when it’ll be ready — but I’m happy to accept this very minor inconvenience to produce my radio shows. Now maybe some other features, such as scheduling, are also affected, but I don’t use them.
As for Mojave, it does seem a tad snappier, but I’ll await official benchmarks with the release version. The iMac’s startup takes nearly twice as long, though. It stops a little more than halfway through, and resumes a short time later. I assume that glitch will also disappear from the release version, though I grant that Apple has allowed OS bugs to persist through a beta process in the past.
This week, Apple released developer beta six, which is the fifth beta made available to public beta testers. Within the next four to six weeks, a Golden Master candidate ought to be out, which means that the rest of the development process will mostly involve fine tuning. Once it’s released, Apple will go full steam into the first update, 10.14.1.
In the meantime, I’ve looked over Mojave’s Dark Mode and turned it off. Maybe I’m too set in my ways. Unfortunately, the latest beta has essentially made my Brother printer useless. Whenever I attempt to print a document, any document, the printer driver app displays an error, “Unable to startup session, error =-10.”
I went through the usual troubleshooting routines, including restarting, resetting the iMac’s printing system, resetting the printer, and reinstalling the latest drivers from Brother. Those drivers are rated for High Sierra, so this is clearly a problem that Apple or Brother — or both — must fix.
For now, I can’t print, since my other printer, an Epson All-in-One, is at a storage facility along with most of my stuff.
So I’m bummed out a little, though it is time I cure the printing habit, so maybe it’s all good.
As to Mojave, I don’t regret installing the betas. At this late stage, it’s probably in decent shape — well except for that printer glitch. But if you’ve waited this long without fretting over it, you might as well wait for the final release that’ll probably arrive at the end of September or early in October.
What I read last week is so typical of anti-Apple foolishness, but I was hardly surprised. As you know, we’re less than two months away from an expected Apple event to introduce new iPhones and no doubt an updated Apple Watch. Whether or not any other gear will be launched is a question mark, even though new iPads and Macs (in addition to the ones launched last week) are expected.
But it’s not too early for the usual gang of Apple haters to claim that whatever is going to happen is wrongheaded, that the company with the world’s largest market cap is simply incapable of doing things right, or to follow the foolish speculation of some wayward and ill-informed blogger. The implication: if Apple doesn’t follow the erratic and illogical twists and turns of these would-be journalists, it will never succeed, and everything that’s happened so far is some gigantic fluke.
Some day, any time now, the market will self-correct and the “right” companies will retake control.
Any time now.
With that in mind, there has been some new speculation from industry analyst Ming-Chi Kuo, whose predictions about new Apple gear are often close to the mark. On that basis, there are reports about refreshed Macs, including the long-awaited Mac mini, a new low-end Mac notebook, refreshes for the rest of the lineup, and the iPad. But there is one wrinkle with Apple’s tablet: an 11-inch model that evidently replaces the 10.5-inch model, and the addition of Face ID to replace Touch ID.
So far, we have the 2018 MacBook Pro sporting huge speed and feature improvements, including a six-core CPU and an available 4TB SSD for a humongous amount of money as notebook computers go.
His predictions about the iPhone haven’t changed. There will be two versions of the iPhone X: a minor update to the existing model, and an iPhone X Plus with a 6.5-inch display. The price of the smaller model may drop by $100, which would leave the bigger handset at the same price point as the original iPhone X, a handset regarded as too expensive even as it became a top seller. In addition, he predicts a model with a 6.1-inch TFT LCD display. The usual range of older models will probably stick around at lower prices, but what about the iPhone SE?
Speculation about the iPhone updates is enough to set hater hearts aflutter, especially the lack of an update for the smallest and cheapest iPhone. Does that mean there won’t be any changes, or has the existence of an update simply been overlooked so far? Well, there was some speculation about a minor CPU update earlier this year, but it has faded.
So does that mean the rumored iPhone SE 2 will never appear?
Obviously the complaint is that failing to produce a low-end model is a bad move for Apple, and thus you can expect abject failure this fall. I don’t pretend to know how many units have been sold so far, though it’s clear most customers appear to prefer the units with larger displays, and, in fact, the most expensive models. But that doesn’t mean that Apple is ignorant of the fact that lots of people still want smaller handsets, and why not satisfy those needs?
The current iPhone SE uses parts from the iPhone 6s, but that’s no reason to panic, for one reason that the distressed blogger overlooks, or maybe doesn’t know about.
As most readers know, Apple is touting huge performance boosts on iPhones with iOS 12, with the iPhone 6 singled out for improvements of up to 50% or more. While no promises are being made about the iPhone 6s, one assumes it’ll also receive a decent level of improvement, and you should expect the same for the iPhone SE.
So even without any change, the existing SE should deliver more than enough performance for most users. Sure, the camera won’t change, but it delivers pretty good photos as it is.
But maybe Apple is planning an SE update. While this model is not a huge seller in the scheme of things, even sales of a few million units are significant, and would be substantial for most other companies. A speed bump and camera enhancement, without actually changing the look, would involve at most a minor R&D expenditure for Apple.
Then again, one might raise the very same arguments for updating the Mac mini, which hasn’t been touched in four years. That neglect is even less defensible, especially since the last revision actually downgraded the model, apparently in exchange for a $100 reduction in price. You could no longer upgrade the RAM, and the CPUs topped out at two cores. The quad-core models that some cherished for use in data centers were no longer available, and Apple never explained why.
As it stands, only one Mac line has been enhanced so far this year.
It’s also quite possible that the iPhone SE won’t be changed this year, or that Apple will replace it with something else that’s still relatively small, but perhaps has further enhancements to bring it in line with current models.
But if that doesn’t happen, it won’t signal a fatal mistake for Apple so long as people are still buying the existing SE. Obviously Apple has no obligation to meet the demands of yet another blogger with an inflated sense of self worth.
Since I’ve been largely in cheapskate mode in recent years, I seek ways to save money. I no longer pay $99 a year to join the Apple Developer Program. At most I miss one or two early previews after the annual WWDC. Otherwise, a public beta usually arrives no more than a day after the developer version, unless there’s something really bad that has to be fixed first.
With the release of the iOS 12 public beta, I went ahead and downloaded it for installation on a late-model iPhone. If something goes wrong, it’s possible to restore your device by downloading a previous version (not to worry, it’s searchable). So I took the plunge.
The first step is installing Apple’s beta profile on your iOS device, so it will be able to alert you to, download, and install new releases.
Since this week’s release is the first for regular folk, don’t expect miracles; the final or near-final version won’t be out for two months. That said, my initial experience, after about a day, hasn’t been so bad. The symptoms are largely about flakiness. Sometimes, when I try to delete an email, the Trash icon isn’t there, and backing out through the menus and returning brings it back.
A handful of web pages stay white and never render or refresh, but it’s not consistent. So far, at least, there have been no crashes.
One of the tentpole features of iOS 12 is not something you can see. It’s the promise of faster app launches, faster keyboard display, and speedier swiping to the camera, ranging from “up to” 50-100%. The highest boost is promised for an iPhone 6.
I read an early review of the first iOS 12 release for developers, in which the promised performance leaps were tested. It was a mixed bag, with some of the touted functions coming close to matching Apple’s claims, and some not-so-different. To be fair, early betas aren’t optimized for performance. Better to test this with a final or near-final release.
So I didn’t bother to actually check speeds. My subjective impressions were positive. It seems to boot faster, and most things appear to be snappier. Both Lyft and Uber, which used to take maybe six seconds to load, took roughly three seconds to launch with the iOS 12 public beta. The zooming effect appears faster and smoother, with no overt evidence of stuttering. Most interesting is the fact that, even though I’m at a motel with a three-megabit Wi-Fi connection, my iPhone didn’t feel that slow at online access.
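For those who do want something better than a subjective impression, the kind of crude before-and-after comparison described above can be sketched in a few lines of Python. This is only a sketch; the two workloads here are invented stand-ins for an app launch under the old and new OS, not actual measurements:

```python
import time

def time_it(fn):
    """Time a single call and return elapsed seconds."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# Hypothetical stand-ins for "app launch" work before and after an update.
slow = lambda: sum(range(2_000_000))
fast = lambda: sum(range(1_000_000))

t_slow = time_it(slow)
t_fast = time_it(fast)
print(f"speedup: {t_slow / t_fast:.1f}x")
```

In practice you would time the same task several times and compare medians, since a single stopwatch run is noisy.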
Assuming faster performance is all or mostly across the board, it means the same hardware that ran iOS 11 will all of a sudden run faster. That’s very much against the grain; usually the oldest supported hardware becomes unbearably sluggish with a new iOS release.
Indeed, it’s very likely some people buy new gear not because what they have doesn’t work, but because its performance has deteriorated so much — and not just because the battery is spent and the CPU is being throttled. Thus Apple might sell fewer iPhones in the short term. But I expect Tim Cook and his team are betting that people will be more satisfied that their gear isn’t exhibiting overt signs of aging, and thus will be just as inclined to upgrade eventually, or more of them will stick with iPhones.
I am not considering how well an Android device ages since many of them never receive a new OS release.
I am interested in Group FaceTime as a possible substitute for Skype, at least for audio-only use, but that means guests for my shows would need to use Apple gear. I’ll keep it in mind.
The added security and privacy features, including a proper password manager and default blocking of social network tracking, are welcome. But they aren’t things people will necessarily notice until they begin to seriously look around.
I’m also intrigued by yet another promise of a better Siri — last year’s promise wasn’t fully realized — and I’ll give it a chance and see if I can reliably take it beyond simple alarms.
For the most part, you should be able to install an iOS public beta without seeing much in the way of front-facing changes, at least at the start. Although I’ve seen over 200 new or changed features listed, they are largely more subtle than usual. This may be in keeping with the rumor that Apple is focusing more on performance and reliability than on adding cool stuff, and that some key features are being held off until 2019.
It’s not that Apple would ever announce that a new OS release contains less than originally planned. And it’s true that some features may be delayed or omitted because they aren’t perfected; still, customers shouldn’t have to wait for months for AirPlay 2 and other promised features either. On the other hand, it’s possible this will become the norm, with some things rolled out through the year as they are ready. It’s not as if Apple has to worry about readying an OS for retail sale.
But even though my initial experiences with iOS 12 are positive, I urge you to be careful about installing a beta OS unless you have a ready backup routine.
I will hold off installing a macOS Mojave beta until it’s closer to release. I no longer have a backup computer, since the next OS won’t run on my 2010 MacBook Pro.
Microsoft’s Xbox Adaptive Controller might be the greatest gamer thing ever. Certainly so for gamers with disabilities. As you may or may not know, gaming controllers are a beast to handle. They have two small joysticks, one D-pad, four top buttons, two trigger buttons and two bumper buttons. “I’m not really good with a controller” is a common complaint from able-bodied gamers. I’ve been gaming for decades and I still find myself going, “Whoops! That was the wrong button, and now my character is dead.”
But the point is clear: you pretty much need both hands to finely manipulate the controller, and the stamina to handle it for many, many hours. That puts disabled folks at a colossal disadvantage just to play a console game. What if a player only has one hand? Or can’t use either hand? Or doesn’t have the strength to push the buttons or pull the triggers? Or doesn’t have the motor functions to properly handle the controller? Or what if they are quadriplegic?
Their options are limited and usually expensive. Some third-party companies manufacture controller accessories that help those with physical disabilities, but no company as large as Microsoft has tackled this effort. Until, obviously, now. And Microsoft spared no expense.
So what is the Adaptive Controller? In Microsoft’s words:
“By taking an inclusive design approach and considerations of gamers who might not be able to reach all the bumpers and triggers or hold a controller for an extended period of time, for example, we were able to design a controller that provides a way for more fans to enjoy gaming. On our journey of inclusive design, we have taken a wider view of our fans and a more inclusive approach to designing for them.
For gamers with limited mobility, finding controller solutions to fit their individual needs has been challenging. The solutions that exist today are often expensive, hard to find, or require significant technical skill to create. A number of individuals and organizations are creating custom solutions, but it has been often difficult for them to scale when most rigs need to be so personalized.
Joining the Xbox family of controllers and devices, the Xbox Adaptive Controller was created to address these challenges and remove barriers to gaming by being adaptable to more gamers’ needs. It was developed in partnership with organizations around the world, including The AbleGamers Charity, The Cerebral Palsy Foundation, Craig Hospital, SpecialEffect, and Warfighter Engaged. We worked closely with them and directly with gamers who have limited mobility to assist in our development. Our goal was to make the device as adaptable as possible, so gamers can create a setup that works for them in a way that is plug-and-play, extensible, and affordable. In addition to working with common adaptive switches that gamers with limited mobility may already own, it has two large buttons built in. These buttons can also be reprogrammed to act as any of the standard controller’s button inputs via the Xbox Accessories app.”
The Adaptive Controller ships in September and will retail for a very reasonable $100. There is already a whole array of accessories available on the Xbox site. The nice thing about the controller is that it has nineteen 3.5mm ports and two USB 2.0 ports for external inputs. A lot of the third-party accessories that disabled gamers use to “hack” regular Xbox gear will be fully compatible with the Adaptive Controller, so gamers won’t have to buy all new expensive accessories; existing ones, such as foot controllers or even mouth controllers, will plug directly into the Adaptive Controller.
According to worldbank.org, fifteen percent of the world’s population (approximately one billion people) has some form of disability, and between 110 and 190 million people experience some kind of significant disability.
I am certainly not going to suggest that the Adaptive Controller is about to save the world by allowing much, much easier access to Mass Effect or Tomb Raider via Xbox One. But - this is certainly a step in the right direction.
As an avid gamer I often want to share this AWESOME experience I just had with game X! It never even occurred to me that this simple pleasure of game experience was extremely difficult for such a large number of folks. I mean, if you’re able bodied and you just don’t want to game - that’s all good. But if you’re physically impaired and you really, really want to game - but you are unable to do so because of your disability - that fucking sucks.
And Microsoft is doing something about that.
Well played, Microsoft. Well played, indeed.
It wasn’t so many months ago when there were loads of reports that Apple’s great experiment, the iPhone X, was a huge failure. Inventories were growing, there were major cutbacks in production. All this allegedly based on reports from the supply chain.
Such blatant examples of fake news aren’t new. It happens almost every winter. After a December quarter and peak sales, Apple routinely cuts back on production from the March quarter. It’s not the only company to follow such seasonal trends, but somehow Apple gets the lion’s share of the attention.
From time to time, Tim Cook schools the media about relying on a few supply chain metrics, reminding them that, even if true, it doesn’t necessarily provide a full picture of supply and demand.
He might as well be talking to himself since he’s almost always ignored.
In any case, the numbers from the December and March quarters painted a decidedly different picture than those rumors depicted. The iPhone X was the best-selling smartphone on Planet Earth for every week it was on sale. I don’t know if the trend has continued, but Apple has nothing to apologize for.
Now one of the memes presented in the days preceding the arrival of the iPhone X — before the talk about its non-existent failure arose — was that it would fuel a super upgrade cycle. Until then, the usual two-year replacement cycle had begun to fade, in part due to the end of the subsidized cell phone contract in the U.S., a change fueled by T-Mobile’s supposedly innovative “Uncarrier” plans. They appeared to represent something different, but at the end of the day weren’t so different in what you had to pay, at least for the term of your smartphone purchase.
Originally, you’d acquire a cell phone either by buying the unit outright, or signing up for a two-year contract in which you’d pay something — or nothing — upfront and then be obligated to keep the service in force for at least two years. If you cancelled early, you’d pay a penalty to cover what the carrier presumably lost because you didn’t pay off the device.
After two years, you’d be able to cancel your contract without penalty, but if you kept it in force, the price wouldn’t change even though the device had been paid off. It was a boon to the carrier if you didn’t upgrade. But if you did, the two-year requirement would start all over again.
With an “Uncarrier” deal, the cell phone purchase was separated from your wireless service. You could buy it, add one you own to the service if it was compatible, or acquire a new handset for an upfront payment, plus a given amount every month until it was paid off. It was essentially a no-interest loan, but you’d have the right to exchange it for a new device after a certain amount of time, usually 12 to 18 months. This way, the purchase became a lease, and you’d never own anything. In exchange for getting new hardware on a regular basis, you’d never stop paying.
What it also meant is that, once your device was paid off, the price would go down, giving you an incentive to keep your hardware longer if it continued to perform to your expectations.
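The installment arithmetic behind such plans is simple enough to work through with a toy example. The numbers below are illustrative only, not any carrier’s actual pricing:

```python
# Toy "Uncarrier"-style device plan: an interest-free installment loan.
# All figures are invented for illustration.
price = 999.00    # handset price
upfront = 99.00   # down payment at purchase
months = 24       # installment term

monthly = (price - upfront) / months
print(f"${monthly:.2f}/month")  # $37.50/month
```

Once those 24 payments are made, the device line item drops off the bill, which is exactly the incentive to hold onto paid-off hardware.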
Now that iPhone super upgrade cycle didn’t occur as predicted. Yes, iPhone sales did increase a tiny bit in the last quarter, but revenue soared because the iPhone X dominated new purchases, thus boosting the average transaction price. That, too, was contrary to all those predictions that Apple’s most expensive smartphone was way overpriced and customers were reacting negatively.
How dare Apple charge $999 and up for a new handset?
Rarely mentioned was the fact that Samsung, Google with its Pixel phones, and other handset makers also sold higher-priced gear, yet drew few complaints. It’s not that their sales were great shakes, but some regarded such handsets as certain iPhone killers, except that Apple overwhelmed them in sales.
So what’s the latest alleged super (duper?) upgrade cycle about?
Well, according to a published report in AppleInsider, Daniel Ives of GBH Insight claims that “the Street is now starting to fully appreciate the massive iPhone upgrade opportunity on the horizon for the next 12 to 18 months with three new smartphones slated for release.”
Deja vu all over again?
For now, Apple has become a Wall Street darling, but don’t bet on that continuing. The next time someone finds reasons, real or imagined, to attack Apple’s prospects for success, the stock price will drop again. Of course, there are other reasons for stock prices to vary, including the state of the economy, possible trade wars, and investor psychology.
So what is Ives expecting?
He is projecting that Apple might sell up to 350 million iPhones over a period of 18 months after this fall’s new product introductions. Supposedly they will be so compelling that people who might have otherwise sat on the sidelines and kept their existing gear will rush to upgrade.
As regular readers might recall, predictions have focused on a new iPhone X and a larger iPhone X Plus, plus a regular iPhone with an edge-to-edge LCD display. Will there be an iPhone 8 refresh, or will Apple just sell last year’s models at a lower price? What about a smaller model, the alleged iPhone SE 2?
I don’t disbelieve the rumors about the 2018 iPhone lineup, but predictions of super upgrade cycles may not be so credible. People appear to be keeping their smartphones longer, so long as they continue to deliver satisfactory performance. And, no, I won’t even begin to consider the performance throttling non-scandal.
The question is the same: how will Apple distribute all the TV shows it will create over the next year or so? Will it be a supplement to Apple Music (say, Apple Music and TV), or will it be a totally different service? Will Apple just offer the shows a la carte, via a series pass, or establish a new streaming service?
Indeed, does the world even need another video streaming service?
Loads of people around the world have Netflix, and it’s producing hundreds of original shows every year. That and its large library of existing content is enough to satisfy many people, but it’s often supplemented by, say, a basic cable or satellite package that provides broadcast channels, a small number of free cable channels, and maybe some other odds and ends. Or perhaps a similar general-purpose streaming service, such as Sling TV from Dish Network or DirecTV Now.
At one time, Apple was rumored to be working on a streaming TV package, but evidently couldn’t reach a deal with the networks. The reasons are unknown, though some blame Apple’s stringent demands, a common excuse. By creating its own content, Apple bypasses the entertainment companies, but it would take years to build a big library, and that would ultimately mean returning to the industry to license content.
But the streaming sector is saturated. There are probably too many services as it is. So I still expect it to be included as part of Apple Music. While some suggest Apple wouldn’t recover the estimated billion dollars it’s spending on its TV project, it plans for the long term, and it doesn’t matter if the existing Apple Music structure doesn’t deliver a huge profit. It’s just one more factor that ties people into Apple’s ecosystem. Apple didn’t make profits from iTunes at first either.
Now on last weekend’s episode of The Tech Night Owl LIVE, we presented a special encore episode featuring Major General (Ret.) Earl D. Matthews. He spent three decades at the nexus of big budgets and cybersecurity, including stints as Director, Cybersecurity Operations and Chief Information Security Officer at HQ, U.S. Air Force, and VP for Enterprise Security Solutions at Hewlett-Packard. In his current role as Senior VP and Chief Strategy Officer at Verodin, Inc., he champions the concept of security instrumentation, a process that continuously validates the effectiveness of each security element in place. During this episode, he covered the gamut of cybersecurity issues, including the privacy problems at Facebook, the DNC hack, and managing your personal privacy at a time when tens of millions of Americans have had their credit reports hacked. Major General Matthews also revealed two episodes of ID theft that impacted his own family.
You also heard from tech columnist and former industry analyst Joe Wilcox, who writes for BetaNews. During this episode, Joe explained why he regards Apple’s Siri voice assistant as worse than Microsoft’s Skype, despite all the connection glitches with the latter. Will hiring former Google executives help Apple make Siri more responsive and accurate, without sacrificing your security? You also heard about Google I/O and Android P, and about all those fake news reports that the iPhone X was unsuccessful. For two straight quarters, however, Apple reported that the iPhone X was not only its best-selling smartphone for each week it was on sale, but the hottest-selling smartphone on the planet. Gene shared his 20 years of experience with the iMac, which began with the original Bondi Blue model that he beta tested for Apple as part of the former Customer Quality Feedback (CQF) program. You also heard about the Apple Watch and whether it makes sense for Apple to switch Macs from Intel to ARM CPUs.
On this week’s episode of our other radio show, The Paracast: On an anniversary of Kenneth Arnold’s UFO sighting, Gene and returning guest cohost J. Randall Murphy host cryptozoologist and Fortean Loren Coleman. Loren discusses the weird deaths of paranormal authors, Ufologists, Mothman-linked folks, Superman personalities, and celebrities. Do these clusters of deaths have any significance, or is it all about coincidence? Will those copycat hangings involving such notables as fashion designer Kate Spade and comic actor Robin Williams continue? What about the reality behind such creatures as Bigfoot, the Loch Ness Monster and Thunderbird? Loren is founder and director of the International Cryptozoology Museum in Portland, Maine.
READY FOR THE iPHONE KEY FOB?
I don’t recall the first time I bought a car with a keyless-entry remote control, commonly known as a key fob. Doing some quick research the other day, I ran across an item describing the 1983 AMC/Renault Alliance as offering a remote that allowed you to lock and unlock the doors. But I was never a fan of AMC’s cars.
I also ran across a mention of a 1987 Cadillac Allanté, an ultimately unsuccessful attempt at building a two-seater roadster for the luxury brand. It was yet another car in which I had no interest whatsoever.
Now I can’t exactly recall the first car I purchased with one of these electronic gizmos, which are, of course, coded for a specific vehicle. It may have been a Honda Accord, but it still started conventionally. A real key popped out of the fob and the ignition assembly was traditional.
My last two cars, the one totaled last year and the one I own now, have a push button on the center console to turn the ignition on or off. Not so long ago, such a feature was available strictly on expensive cars, but it has gradually filtered down to more affordable vehicles.
While those key fobs are supposedly reasonably secure, they aren’t perfect. One of my guests on the tech show some months back, an “ethical hacker,” said you could compromise a key fob for many makes and models with a $35 device. It required being near the fob when it was scanned. I suppose that means they could be hiding in the bushes near a busy shopping plaza.
So is there a way for Apple to help provide a more secure option for a digital key system?
Well, according to a published report in AppleInsider, a new standard for digital keys has been published by the Car Connectivity Consortium, of which Apple is a member. It would allow you to use an NFC-enabled smartphone, a category that includes both high-end Android handsets and any iOS device with Apple Pay support, to unlock doors and motor vehicles, among other things.
In effect, it would recognize your iPhone and other gear as a substitute for a key fob. Supposedly it’ll offer a higher level of security than existing keyless-entry systems can achieve, and I would believe that for Apple’s gear. Then again, if you can duplicate the data from an existing key fob with a $35 device, I suppose almost any secured solution ought to be better. More to the point, however, is the method by which your mobile handset would be activated. Would it require pairing with your existing key fob, or would automakers and lock manufacturers provide specific support to enable such a feature?
As with any new system protocol, however, it will take a while for you to see it appear. Car makers usually develop new vehicles a year or two out, so it may not happen until 2020; that is, assuming you can’t adapt existing systems, and I suspect you can’t if it requires custom chips to provide added security.
As with Apple CarPlay and Android Auto, CCC support may first appear on expensive vehicles and gradually filter down to the models that regular people can afford.
In other words, when the standard finally shows up in a wide variety of cars, I may be too old to care.
Starting with iOS 12, there will be a free feature in Apple’s software that will automatically (and securely) share your exact location data with first responders. I’ve seen a few cynics already crying that Apple will use this information for nefarious purposes, but all of us reasonable folk will instantly see the time- and life-saving value in the new feature.
It takes time to get information from a caller. Sometimes the call needs to be rerouted, which costs more time. And during an emergency, the caller might not have time to spare.
The current problem with mobile phones is summed up on the FCC website:
“While wireless phones can be an important public safety tool, they also create unique challenges for emergency response personnel and wireless service providers. Since wireless phones are mobile, they are not associated with one fixed location or address. While the location of the cell site closest to the 911 caller may provide a general indication of the caller's location, that information is not always specific enough for rescue personnel to deliver assistance to the caller quickly.”
My understanding is that the nation’s six thousand emergency call centers are, well, outdated. The vast majority of 911 call centers are still set up for landlines. If you call from a landline, the emergency center knows your exact fixed position, which was great for a very long time. But 70 percent of emergency calls now come from mobile phones, and that number is rapidly increasing.
Enter RapidSOS, a company that is focused on upgrading outdated call center tech. Michael Martin, co-founder and CEO of RapidSOS, says this about his company:
“Last year, more than 10,000 people died when they could not relay fast and accurate information after calling 911. People in danger don’t always have the presence of mind to press the right numbers and explain where they are. But what if your smartphone could do that for you? That’s the idea behind RapidSOS’s smartphone app, Haven. With a single touch, it sends the 911 dispatcher your exact location.”
This upgrade probably won’t single-handedly improve first response times for most callers, because you’ll have to have an iPhone running iOS 12, and so on, but it will certainly get the ball rolling in the right direction for others to follow suit.
iOS 12 will roll out in late 2018.
Over the years, some tech pundits have decided that Apple really needs to drop the Mac. To them, it has outlived its usefulness and, besides, far more money is made from selling iPhones.
But it’s a good source of hit bait to claim that “Mac users don’t really matter to Apple.”
Indeed, Apple has, at times, made it seem as if that claim was accurate. The Mac mini has not been refreshed since 2014. After releasing a total redesign for the Mac Pro in late 2013, Apple appeared to drop the ball and mostly abandoned that model.
When a new MacBook Pro was launched in late 2016, some thought the claim that it was a professional notebook was a huge exaggeration. It was thinner, in the spirit of recent Apple gear, but the highly touted Touch Bar, powered by an ARM system-on-a-chip, was thought to be fluff and not much else.
Apple also got dinged for things it had never offered, such as a model with 32GB of RAM. But that would have required a different memory controller that might have hurt performance and battery life. By way of comparison, most PC notebooks were also limited to 16GB. A future Intel CPU update will offer an integrated memory controller that doubles memory capacity.
Just after Christmas, a Consumer Reports review failed to recommend the 2016 MacBook Pro supposedly due to inconsistent battery life. After Apple got involved, it turned out that CR’s peculiar testing scheme, which involves disabling the browser cache, triggered a rare bug. After Apple fixed it, a retest earned the MacBook Pro an unqualified recommendation.
Was all this proof that Apple just didn’t care about Macs?
Well, it’s a sure thing the Touch Bar wasn’t cheap to develop, and embedding an ARM chip in a Mac is definitely innovative. But Apple’s priorities appeared to have gone askew, as the company admitted during a small press roundtable in early 2017.
The executive team made apologies for taking the Mac Pro in the wrong direction, and promised that a new model with modular capabilities was under development, but it wouldn’t ship right away. There would, however, be a new version of the iMac with professional capabilities. VP Philip Schiller spoke briefly about loving the Mac mini, but quickly changed the subject.
Before the 2017 WWDC, I thought that Apple would merely offer more professional parts for customized 27-inch 5K iMacs. But such components as Intel Xeon-W CPUs and ECC memory would exceed that model’s thermal limits. So Apple extensively redesigned the cooling system to support workstation-grade parts.
The 2017 iMac Pro costs $4,999 and up, making it the most expensive, and most powerful, iMac ever. You can only upgrade the RAM, and it’s a dealer-only installation, since it requires taking the unit completely apart, unlike the regular large iMac, where memory upgrades are a snap.
Apple promised that a new Mac Pro, which would meet the requirements of pros who want a box that’s easy to configure and upgrade, would appear in 2019, so maybe it’ll be demonstrated at a fall event where new Macs are expected.
But Apple surely wouldn’t have made the commitment to expensive Macs if it didn’t take the platform — and Mac users — seriously. The iMac Pro itself represents a significant development in all-in-one personal computers.
Don’t forget that the Mac, while dwarfed by the iPhone, still represents a major business for Apple. Mac market share is at its highest levels in years in a declining PC market, serving tens of millions of loyal users. When you want to develop an app for iOS, tvOS or watchOS, it has to be done on a Mac. That isn’t going to change. In addition, Apple is porting several iOS apps for macOS Mojave, and developers will have the tools to do the same next year.
According to software head Craig Federighi, iOS and macOS won’t merge and the Mac will not support touchscreens.
Sure, the Mac may play second fiddle to the iPhone, but that doesn’t diminish the company’s commitment to the platform. But it’s still easy for fear-mongering tech pundits to say otherwise, perhaps indirectly suggesting you shouldn’t buy a Mac because it will never be upgraded, or that upgrades will be half-hearted.
Perhaps there’s an ulterior motive behind some of those complaints: to discourage people from buying Macs and push them towards the latest PC boxes that, by and large, look the same as the previous PC boxes with some upgraded parts.
But since Intel has run late with recent CPU upgrades, Apple has often been forced to wait for the right components before refreshing Macs. That doesn’t excuse the way the Mac mini and the MacBook Air have been ignored, but I’ll cut Apple some slack with the Mac Pro, since a major update has been promised for next year.
Now this doesn’t mean the Mac isn’t going to undergo major changes in the coming years. Maybe Apple is becoming disgusted with Intel’s growing problems in upgrading its CPUs, and will move to ARM. Maybe not. But that’s then, this is now.
A few years back, I embarked on an upgrade mission: to swap out the slow hard drive in my 2009 27-inch iMac and replace it with a nice, speedy SSD. With the cooperation of Larry O’Connor of Other World Computing, I got ahold of a 1TB drive and an upgrade kit, consisting of a few tools and suction cups. The latter were used to pry the display from the chassis.
Once the glass was extracted, it had to be placed on a soft surface (I put it on a bed). The rest of the job largely involved carefully unhooking several thin, easily damaged wiring harnesses, and then the drive itself. OWC provides an adapter cable to make the new drive compatible with the iMac.
All told, it took about an hour to get through the process and reassemble the computer. O’Connor’s company offers installation videos on its site to simplify the process.
The reason I bring this up is the first interview on this episode of The Tech Night Owl LIVE, where we were joined by tech columnist Rob Pegoraro, who writes for USA Today, Yahoo Finance, Wirecutter and other publications. At the beginning of this segment, Rob explained that he took apart his vintage 27-inch iMac, from 2009, in order to replace the drive with an SSD from Other World Computing. Gene shared his experiences in upgrading a similar computer several years ago. In later iMacs, the display is held on with an adhesive strip, making the disassembly and reassembly process far more complicated. There was also a discussion about Siri’s voice recognition problems, and a recent report that someone’s Amazon Echo Dot, featuring Alexa, recorded a personal conversation and sent the file to a contact in another city.
Can we trust these digital assistants to respect our privacy? Rob also talked about a meeting with security experts discussing changes and possible improvements in online security over the past 20 years.
The Amazon scandal is also discussed in the next article.
After the interview with Rob was recorded, I contacted two local authorized third-party Apple repair shops to ask whether they’d be able to upgrade the drive in a more recent 27-inch iMac, and how much it would cost. The process involves removing the adhesive that holds the display to the chassis. It’s not something I’d care to tackle.
Well, the first dealer gave a flat no, saying that even trying would damage the computer. That didn’t sound right to me, since Apple uses a similar process to upgrade memory on the iMac Pro. It can’t be upgraded as simply as the regular large iMac, which has a RAM cover at the bottom. Maybe that particular dealer didn’t want to bother or had a bad experience or two.
A second dealer gave me a detailed quote that included labor, two adapters from Other World Computing, plus backup and restore. It came to $457.93!
When I looked at the numbers, though, it mostly made sense: they charge $200 for a full backup and restore, $19.99 for the replacement adhesive strip, $79.99 for the needed OWC and Newer adapters, and $150 for the actual labor. OWC sells SSDs with the proper adapters, and the customer can always restore the data themselves, so the price could be as “low” as $169.99.
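For what it’s worth, here’s how the dealer’s line items add up, sketched in Python. Note that the idea that the gap between the itemized sum and the quoted $457.93 reflects tax or fees is my assumption, not anything the dealer said:

```python
# Line items from the dealer's quote, as described above.
quote_items = {
    "backup_and_restore": 200.00,
    "adhesive_strip": 19.99,
    "owc_newer_adapters": 79.99,
    "labor": 150.00,
}

itemized_total = round(sum(quote_items.values()), 2)
print(itemized_total)  # 449.98 -- the quoted $457.93 presumably adds tax or fees

# The do-it-yourself path: buy an SSD with the proper adapters from OWC
# and restore the data yourself, leaving only labor plus the adhesive strip.
diy_total = round(quote_items["labor"] + quote_items["adhesive_strip"], 2)
print(diy_total)  # 169.99
```

That difference between $449.98 and $169.99 is why shopping the parts yourself matters so much here.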
In a special encore presentation, you heard a vintage segment featuring Ben Williams of Adblock Plus. Ad blocking has experienced a lot of activity over the past year, especially since Google entered the fray with its ad filter for Chrome. There are still battles between publishers and ad blockers, and payment systems to publishers from users are being talked about with more frequency. Gene and Ben also engaged in an extended discussion about the value of online advertising, and the long history of making it as offensive as possible. There was also a fun pop culture discussion, about ads that build branding images based on using a well-known personality, such as Oscar-winning actor J.K. Simmons, known for Farmers Insurance commercials and loads of movies and TV shows, including the recent comic book film, “Justice League,” where he played Commissioner Gordon. You also learned how ad blockers can be configured to allow ads that have been approved by Adblock Plus.
On this week’s episode of our other radio show, The Paracast: Gene and guest cohost Goggs Mackay present Dr. Jack Hunter, an anthropologist and author of “Engaging the Anomalous: Collected Essays on Anthropology, the Paranormal, Mediumship, and Extraordinary Experience.” In this book, Dr. Hunter poses serious questions about consciousness, experience, spirits, mediumship, psi, the nature of reality, and how best to investigate and understand them.
In this discussion, Dr. Hunter will present stories of personal experiences, encounters with mediums, and float a wide variety of suggestions as to how various paranormal phenomena might somehow be connected, and that includes the UFO mystery. Dr. Hunter is the founder and editor of a free online journal, Paranthropology.
SSSSHHHH: ALEXA IS LISTENING
Let me start with the Siri follies.
With growing concern that Apple’s Siri digital assistant isn’t capable of matching the competition from Amazon and Google, there are rumors that the next WWDC will feature news of a major refresh. Last year, Apple touted that Siri would receive a new voice and machine learning, but it’s not at all certain there has been much change beyond a smoother conversational tone.
A recent published report featured expressions of sour grapes from former Siri employees, plus a claim that it worked fine when reporters tested it before it went public. But after it launched, beginning with the iPhone 4s in 2011, Siri’s bugs were legion. Maybe it just couldn’t cope with the mass of requests it faced under real-world load.
The Night Owl’s personal experiences are hit or miss. Despite the fact that I have 25 years’ experience as a broadcaster, and a decade of voice training, Siri is sometimes deaf to me. A simple example is asking Maps to navigate to the nearest Walmart. There happen to be several, a few miles apart, but Siri will only produce a list, and rarely does that list display the location I seek. I find it easier to search in Google and manually pick the store to which I want to travel.
But that process hardly makes it hands free. I have to stop somewhere first to make my selection. So I tend to focus on setting alarms or reminders, where Siri is mostly correct.
One excuse given for Siri’s subpar performance is that Apple doesn’t want to infringe on your privacy, so it doesn’t actively collect information about you to push and store beyond the device itself. The theory goes that, since Siri resides online, more open access to your device and your requests would yield more accurate results to more complicated queries.
That takes us to one of the “superior” digital assistants, Alexa, which is featured on the Amazon Echo smart speakers. Indeed, Alexa and the Google Assistant are supposed to represent the cutting edge of voice recognition and response technology.
Apple is often urged to relent somewhat on online privacy in order to deliver a smarter and more dependable Siri. But maybe that’s not the right idea after all.
Then there’s a published report about an overeager Alexa, which confirmed the worst fears about such digital voice assistants. Recording someone’s private conversation and sending it to someone, even someone from their contact list, is the very definition of eavesdropping. I suspect intelligence agencies might be salivating over the ease with which this stunt can be pulled off.
As you might expect, the family contacted Amazon “multiple times,” according to a published report, and conversed with one of the Alexa engineers, who looked into the matter to figure out what went wrong. In the end, the existence of a bug was confirmed.
According to Amazon’s statement, “Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.”
Well, you can hardly expect them to say anything else.
Now I want to be fair to Amazon; perhaps it was just a glitch, as they claimed, one that they have already fixed or soon will. But how often has this happened? And had there not been publicity about this particular episode, would anything have been done, other than perhaps making some excuses to the victims?
To be blunt: Amazon does a fine job delivering merchandise at affordable prices, but its customer service, largely outsourced, is not easy to deal with. Whether a chat or a phone call, you often have to explain and re-explain the problem several times for the basics to be understood.
That doesn’t mean Amazon is being careless about Alexa and how it works as the frontend to a smart speaker. Again, I am not suggesting this mishap was anything more than a rare system glitch of some sort.
One article I read on Alexa’s inadvertent attempt at spying tried to connect it to Apple and the HomePod, and whether it, too, might accidentally record someone’s personal conversation and email it to someone. But that’s not the province of Apple’s smart speaker; we benefit from the fact that it was not designed to record your random conversations in the course of isolating a request.
Maybe you’d rather not have HomePod laden with too many features after all, however useful it might seem to some users.
In yesterday’s column, I expressed my deep concerns about elements of Consumer Reports’ testing process. It was based on an article from AppleInsider. I eagerly awaited part two, hoping that there would be at least some commentary about the clear shortcomings in the way the magazine evaluates tech gear.
I also mentioned two apparent editorial glitches I noticed, in which product descriptions and recommendations contained incorrect information. These mistakes were obvious with just casual reading, not careful review. Clearly CR needs to beef up its editorial review process. A publication with its pretensions needs to demonstrate a higher level of accuracy.
Unfortunately, AppleInsider clearly didn’t catch the poor methodology used to evaluate speaker systems. As you recall, they use a small room, and crowd the tested units together without consideration of placement, or the impact of vibrations and reflections. The speakers should be separated, perhaps by a few feet, and the tests should be blind, so that the listeners aren’t prejudiced by the look or expectations for a particular model.
CR’s editors claim not to be influenced by appearance, but they are not immune to the effects of human psychology, and the factors that might cause them to give one product a better review than another. Consider, for example, a crucial companion to blind testing: level matching. All things being equal, a system that is a tiny bit louder (a fraction of a dB) might seem to sound better.
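To see just how small a level difference we’re talking about, here’s a minimal Python sketch converting a decibel level difference into an amplitude ratio; the sample values are purely illustrative:

```python
def db_to_amplitude_ratio(db: float) -> float:
    """Convert a level difference in decibels to a voltage/amplitude ratio."""
    return 10 ** (db / 20)

# Even half a decibel, which listeners tend to register as "better" rather
# than "louder," is only about a 6% amplitude difference.
for db in (0.25, 0.5, 1.0):
    print(f"{db} dB -> amplitude ratio {db_to_amplitude_ratio(db):.3f}")
```

A difference that tiny is trivial to miss when setting up a comparison, which is exactly why careful testers match levels with instruments before any listening begins.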
I don’t need to explain why.
Also, I was shocked that CR’s speaker test panel usually consists of just two people with some sort of unspecified training, so they “know” what loudspeakers should sound like. A third person is only brought in if there’s a tie. Indeed, calling this a test panel, rather than a couple of testers, or a test duo or trio, is downright misleading.
Besides, such a small sampling doesn’t account for the subjective nature of evaluating loudspeakers. People hear things differently, and they have different expectations and preferences. All things being equal, even with blind tests and level matching, a sampling of two or three is still not large enough to reach a consensus. A listening panel with enough participants to reveal a trend might be, but the lack of scientific controls from a magazine that touts accuracy and reliability is very troubling.
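A back-of-the-envelope sketch makes the point. Assume, for illustration only, that the listeners hear no real difference and are effectively flipping coins between two speakers; how often would a tiny panel still “agree” on a winner?

```python
def prob_unanimous_by_chance(n: int) -> float:
    """Probability that n listeners who are purely guessing between two
    speakers all happen to name the same one as the winner."""
    return 2 * 0.5 ** n

print(prob_unanimous_by_chance(2))   # 0.5
print(prob_unanimous_by_chance(3))   # 0.25
print(prob_unanimous_by_chance(12))  # 0.00048828125
```

In other words, a unanimous verdict from two guessing listeners happens half the time by pure chance, while a dozen listeners agreeing by accident is vanishingly unlikely. That’s the statistical case for a real panel.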
I realize AppleInsider’s reporters, though clearly concerned about the notebook tests, were probably untutored about the way the loudspeakers were evaluated, and the serious flaws that make the results essentially useless.
Sure, it’s very possible that the smart speakers from Google and Sonos are, in the end, superior to the HomePod. Maybe a proper test, with a large enough listener panel and proper setup, would reveal such a result. So far as I’m concerned, however, CR’s test process is essentially useless for anything other than systems with extreme audio defects, such as excessive bass or treble.
I also wonder just how large and well-equipped the other testing departments are. Remember that magazine editorial departments are usually quite small. The consumer publications I wrote for had a handful of people on staff, and mostly relied on freelancers. Having a full-time staff is expensive. Remember that CR carries no ads. Income is mostly from magazine sales, plus the sale of extra publications and services, such as a car pricing service, and reader donations. In addition, CR requires a multimillion-dollar budget to buy thousands of products at retail every year.
Sure, cars will be sold off after use, but even then there is a huge loss due to depreciation. Do they sell their used tech gear and appliances via eBay? Or donate to Goodwill?
Past the pathetic loudspeaker test process, we have their lame notebook battery tests. The excuse for turning off browser caching doesn’t wash. To provide an accurate picture of the battery life consumers should expect under normal use, they should perform tests that don’t require activating obscure menus and features that only web developers might use.
After all, people who buy personal computers will very likely wonder why they aren’t getting the battery life CR achieved. They can’t! At the end of the day, Apple’s tests of MacBook and MacBook Pro battery life, as explained in the fine print at its site, are more representative of what you might achieve. No, not for everyone, but certainly if you follow the steps listed, which do represent reasonable, if not complete, use cases.
It’s unfortunate that CR has no competition. It’s the only consumer testing magazine in the U.S. that carries no ads, is run by a non-profit corporation, and buys all of the products it tests anonymously via regular retail channels. Its setup conveys the veneer of being incorruptible, and thus more accurate than the tests from other publications.
It does seem, from the AppleInsider story, that the magazine is sincere about its work, though perhaps somewhat full of itself. If it is truly honest about perfecting its testing processes, however, perhaps it should reach out to professionals in the industries that it covers and refine its methodology. How CR evaluates notebooks and speaker systems raises plenty of cause for concern.
AppleInsider got the motherlode. After several years of back and forth debates about its testing procedures, Consumer Reports magazine invited the online publication to tour their facilities in New York. On the surface, you’d think the editorial staff would be putting on their best face to get favorable coverage.
And maybe they will. AppleInsider has only published the first part of the story, and there are apt to be far more revelations about CR’s test facilities and the potential shortcomings in the next part.
Now we all know about the concerns: CR finds problems, or potential problems, with Apple gear. Sometimes the story never changes, sometimes it does. But the entire test process may be a matter of concern.
Let’s take the recent review that pits Apple’s HomePod against the high-end Google Home Max, which sells for $400, and the Sonos One. In this comparison, “Overall the sound of the HomePod was a bit muddy compared with what the Sonos One and Google Home Max delivered.”
All right, CR is entitled to its preferences and its test procedures, but let's take a brief look at what AppleInsider reveals about them.
So we all know CR claims to have a test panel that listens to speakers set up in a special room that, from the front at least, comes across as a crowded audio dealer with loads of gear stacked up one against another. Is that the ideal setup for a speaker system that’s designed to adapt itself to a listening room?
Well, it appears that the vaunted CR tests are little better than what an ordinary subjective high-end audio magazine does, despite the pretensions. The listening room, for example, is small with a couch, and no indication of any special setup in terms of carpeting or wall treatment. Or is it meant to represent a typical listening room? Unfortunately, the article isn’t specific enough about such matters.
What is clear is that the speakers, the ones being tested and those used for reference, are placed in the open adjacent to one another. There’s no attempt to isolate the speakers to prevent unwanted reflections or vibrations.
Worse, no attempt is made to perform a blind test, so that a speaker’s brand name, appearance or other factors don’t influence a listener’s subjective opinion. For example, a large speaker may seem to sound better than a small one, but not necessarily because of its sonic character. The possibility of prejudice, even unconscious, against one speaker or another, is not considered.
But what about the listening panel? Are there dozens of people taking turns to give the speakers thorough tests? Not quite. The setup involves a chief speaker tester, one Elias Arias, and one other tester. In other words, the panel consists of just two people, supposedly trained as skilled listeners in some unspecified manner, with a third brought in only in the event of a tie. But no amount of training can compensate for the lack of blind testing.
Wouldn’t it be illuminating if the winning speaker still won if you couldn’t identify it? More likely, the results might be very different. But CR often appears to live in a bubble.
Speakers are measured in a soundproof room (an anechoic chamber). The results reveal a speaker’s raw potential, but they don’t show how it behaves in a normal listening room, where reflections will impact the sound that you hear. Experienced audio testers may also perform the same measurements in the actual listening location, so you can see how a real-world set of numbers compares with what the listener actually hears. Comparing those numbers with the ones from the anechoic chamber might also indicate how the listening area impacts the measurements.
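To illustrate why in-room and anechoic numbers differ, here’s a minimal Python sketch of a single wall or floor reflection combining with the direct sound, producing the periodic peaks and dips known as comb filtering. The 1 ms delay and half-amplitude reflection are illustrative assumptions, not measurements of any real room:

```python
import math

def in_room_magnitude(freq_hz: float, delay_s: float = 0.001,
                      refl_gain: float = 0.5) -> float:
    """Magnitude of direct sound plus one delayed reflection:
    |1 + g * e^{-j*2*pi*f*t}| for delay t and reflection gain g."""
    phase = 2 * math.pi * freq_hz * delay_s
    real = 1 + refl_gain * math.cos(phase)
    imag = -refl_gain * math.sin(phase)
    return math.hypot(real, imag)

# With a 1 ms reflection: peaks near multiples of 1 kHz, dips in between.
print(in_room_magnitude(1000))  # ~1.5 (reflection arrives in phase)
print(in_room_magnitude(500))   # ~0.5 (reflection arrives out of phase)
```

Even this toy model swings the response by several dB depending on frequency, which is why a speaker’s in-room behavior can look nothing like its anechoic curve.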
Now none of this means that the HomePod would have seemed less “muddy” if the tests were done blind, or if the systems were isolated from one another to avoid sympathetic vibrations and other side effects. It might have sounded worse, the same, or the results might have been reversed. I also wonder if CR ever bothered to consult with actual loudspeaker designers, such as my old friend Bob Carver, to determine the most accurate testing methods.
It sure seems that CR comes up with peculiar ways to evaluate products. Consider tests of notebook computers, where they run web sites from a server in the default browser with cache off to test battery life. How does that approach possibly represent how people will use these notebooks in the real world?
At least CR claims to stay in touch with manufacturers during the test process, so they can be consulted in the event of a problem. That approach succeeded when a preliminary review of the 2016 MacBook Pro revealed inconsistent battery results. It was strictly the result of that outrageous test process.
So turning off caching in Safari’s usually hidden Develop menu revealed a subtle bug, which Apple fixed with a software update. Suddenly a bad review became a very positive review.
Now I am not going to turn this article into a blanket condemnation of Consumer Reports. I hope there will be more details about testing schemes in the next part, so the flaws — and the potential benefits — will be revealed.
In passing, I do hope CR’s lapses are mostly in the tech arena. But I also know that their review of my low-end VW claimed the front bucket seats had poor side bolstering. That turned out to be totally untrue.
CR’s review of the VIZIO M55-E0 “home theater display” mislabeled the names of the setup menu’s features in its recommendations for optimal picture settings. It also claimed that no printed manual was supplied with the set; this is half true. You do receive two Quick Start Guides in multiple languages. In its favor, most of the picture settings actually deliver decent results.
Does anyone recall ever benefiting because one company merged with another? That’s not quite the same as Apple’s purchase of Beats, with its expensive headphones, because that deal was more about acquiring technology, something that’s been done for years.
So consider the act that saved Apple, acquiring NeXT in 1996, which brought a state-of-the-art Unix-based OS that, over the years, morphed into macOS and iOS. That move came in the wake of the failure of Copland, Apple’s own effort to build a successor to Mac OS. It took a while to jell, but here in 2018, we are still benefiting from the fruits of that transaction.
It also brought Steve Jobs back to Apple, and the rest is history.
From Apple A-series processors, to Touch ID, Face ID and — yes — even Siri, Apple’s ongoing acquisitions of technology companies have delivered compelling features that have advanced the company, and enhanced the user experiences of hundreds of millions of customers.
But when two companies consummate a normal merger, there are almost always promises of realizing synergies, and somehow benefiting customers. In the end, the stockholders and the executives become richer, but people lose their jobs because they are deemed redundant. With fewer competitors, prices just may increase.
When buying a company with different products and services, it may be easier to get approval from the powers-that-be in the U.S. government. Even then, there may be restrictions to reduce corporate excesses of one sort or another. When Comcast, the number one cable and broadband company in the U.S., completed its acquisition of NBC/Universal in 2011, the deal came with restrictions to ensure fair treatment to competing companies.
So Comcast needed to be fair in negotiating carriage deals for NBC content, which includes such cable networks as Bravo, CNBC, MSNBC, SyFy and USA.
Over the years, I’ve heard all sorts of tech support horror stories from Comcast cable customers. There’s no indication things became any better after the merger. Of course, the entertainment division isn’t involved in direct interactions with individual consumers.
When AT&T bought DirecTV, the world’s largest satellite TV network, in 2015, the support systems were combined. Not only were jobs lost, but service got a whole lot worse. These days, when I dial up AT&T for satellite or wireless support, I have to navigate through a mostly deaf voice assistant, and I’m often forced to talk to several people just to resolve a simple issue. How does that save money?
I remain a customer for two reasons. First, no other TV service is available at this apartment complex, which is wired for CenturyLink and includes a single DirecTV satellite dish feeding all the units in each building. Reception via an indoor digital antenna is hit or miss. Second, although it was hard to find, I receive an AARP discount on AT&T wireless, and that discount is enough to match T-Mobile’s “Uncarrier” price.
Speaking of which, AT&T attempted to merge with T-Mobile in 2011, but the government said no. Forced to compete on its own terms, T-Mobile began its “Uncarrier” promotion, which did away with standard two-year contracts, and overhauled the industry.
As a result, your wireless bill is no doubt cheaper regardless of the carrier. T-Mobile is growing rapidly; the move spurred Sprint to slash its prices, so both became more compelling alternatives to market leaders Verizon and AT&T.
Now T-Mobile and Sprint are trying to become one. But T-Mobile’s flashy CEO, John Legere, insisted that there will still be more competition in the market than most believe: “This isn’t a case of going from four to three wireless companies—there are now at least seven or eight big competitors in this converging market.”
Or maybe not.
True, cable providers are entering or planning to enter the cell phone market, but it’s not at all likely that they’ll suddenly become major competition for the big four — make that big three if this merger is consummated.
At the same time, it is true that T-Mobile and Sprint together will provide healthier competition to Verizon and AT&T. As it stands, T-Mobile has good cellular coverage in larger cities but relatively poor coverage in rural areas. A larger footprint will also provide more network resources and revenue to speed deployment of 5G networks.
In theory, that should be a good thing.
Then again, as Sprint learned when it bought Nextel in 2005, combining two incompatible networks is no easy task. Basically Nextel was shunted to the side in the wreckage of that deal.
T-Mobile uses GSM, same as AT&T. Sprint uses CDMA, same as Verizon. Sprint claims some 20 million of its customers have handsets compatible with T-Mobile, which will be the surviving company. But even after a migration period to the combined service, which will take from two to three years, millions of users will be left with incompatible handsets, unless their equipment supports LTE and they live in an area with an acceptable LTE signal. I just hope there will be special discounts for people with bricked phones to upgrade.
While Legere also claimed that more employees will be needed at the combined company, that may be a tricky response. Perhaps there will be — workers will be needed to perform the hardware migration and upgrades. But what about sales and support people? How many of them will be getting pink slips? Doesn’t it make sense that there will be thousands of redundant positions, or does T-Mobile expect many of these employees to transfer to the hardware division?
Will prices really go down?
Of course, this deal hasn’t been okayed by the authorities, and there may be restrictions to protect customers with potentially obsolete gear, among other things. It would be nice to see guarantees that prices won’t increase, but such restrictions are usually temporary. What will the market be like in five years?
I am, however, pleased that the new company will be in T-Mobile’s image and not Sprint’s. I tried Sprint in the early 2000s, before switching to AT&T. As bad as the latter’s support is now, Sprint was far, far worse.
Regular readers know that I’ve spent an awful lot of time correcting fake news about Apple. Is it because I’m an Apple fanboy? No, it’s more about my obsessive nature. Without claiming that I’m in any way perfect, I dislike reading false information about anyone or anything. I’m very much in favor of reporting the news as accurately as possible and correcting mistakes when it’s necessary, even very slight ones.
In a sense, then, these columns are very much works in progress. When the story changes, or I discover a typo, I update. It’s one of the good things about the Internet, but it also makes it easier to post falsehoods without much in the way of consequences. It’s just more clutter, and there’s so much of that you can barely keep up.
Now maybe there is hope. Last week’s revelation that the iPhone X, even through the March quarter, was Apple’s best selling smartphone and, in fact, the best-selling smartphone on planet Earth, would surely have convinced the naysayers that they were wrong all along about sales collapsing. Or maybe not. The nonsense about supposed negative supply chain data pointing to poor sales of Apple gear has long ago been disproven.
But some people never learn, or maybe there’s an advantage in saying bad things about Apple, even though a lot of those statements are outright lies.
In any case, on this week’s episode of The Tech Night Owl LIVE, we presented commentator Josh Centers, Managing Editor for TidBITS, and author of “Take Control of Apple TV” and other titles, who focused a main part of his conversation with Gene on Apple’s record earnings for the March 2018 quarter. Despite all the unfounded rumors of poor iPhone X sales, which hurt the company’s stock price for several weeks, Apple reported that its flagship smartphone was its top-selling gadget for every week it was on sale — and thus the top-selling mobile handset on the planet. You also heard about Apple’s decision to discontinue AirPort Wi-Fi routers, why it may have occurred, and possible alternatives. And what about the announcement that, once again, T-Mobile and Sprint are attempting a merger? Will the attempt succeed this time with a different administration in Washington? Will customers receive better service, and how will prices be impacted? What about the fate of employees of both companies, and merging two incompatible cellular networks? Josh also explained why, for now, he’s basically stuck with Verizon Wireless in the rural area in which he lives.
You also heard from outspoken columnist Bryan Chaffin, co-founder and co-publisher of The Mac Observer, who explained why false rumors about alleged poor iPhone X sales got his dander up. Gene and Bryan talked at length about such fake stories, and how Apple actually fared during the March quarter compared to last year. There were also discussions about the proposed T-Mobile/Sprint merger, and how the plan differs from AT&T’s plans to join forces with Time Warner. Will the political winds in Washington force AT&T to ditch CNN to get the merger approved by the Department of Justice? There was also a discussion about the news that Twitter has asked its entire membership to change their passwords because of a purported error in storing them internally in plain text. Twitter claimed outsiders were not impacted, but that didn’t stop Gene from immediately changing his password.
On this week’s episode of our other radio show, The Paracast: Gene and guest co-host Michael Allen present a return visit by researcher MJ Banias, a blogger who critically and philosophically examines the weird, the strange and the anomalous. During this episode, MJ will discuss the latest episode of the “MUFON Follies,” a new documentary about the Flatwoods Monster, a creature seen in West Virginia in 1952, and even how he accidentally got involved in debates over the Billy Meier contacts. And what about the alleged alien agenda? MJ was a former field investigator with MUFON, has been featured on multiple podcasts and radio shows, and contributes to Mysterious Universe and Rogue Planet. His work has been included in FATE Magazine, and in a collection of UFO-related essays entitled UFOs: Reframing the Debate.
I REMEMBER THE iMAC
In 1998, the typical Mac was a large beige desktop, or a black PowerBook. Simple, conservative, powerful. In those days the PowerPC ate Intel Pentiums for lunch. It took years for the PowerPC’s reign as the fastest PC processor to end.
In May of that year, Steve Jobs announced a revolution in personal computing — with an emphasis on simple Internet access — the iMac. It didn’t ship until August of that year, but I already had one in my home. As a member of Apple’s Customer Quality Feedback program, I was beta testing the original Bondi Blue iMac. It would go on sale for $1,299, but my Apple contact told me I could keep it if it survived a final firmware update.
I wasn’t surprised to see it didn’t, and thus I sent it back for, they told me, proper disposal. But armed with that experience, and with Apple’s approval, I wrote an article about the iMac for a Phoenix newspaper, which included an interview with none other than Jonathan Ive.
In retrospect, the iMac was a revolution, setting the stage for future Macs, but to me it was just a low-end consumer all-in-one computer. It took a while to see the method in Apple’s madness. To me it didn’t provide the higher end features I needed for my work.
As a practical matter, though, its 233 MHz PowerPC G3 was as powerful as the one offered in the most expensive Power Macintosh minitower the previous year, although many of its parts came from the PowerBook. But just when Apple had finally simplified the RAM upgrade process on other Macs, doing it on an iMac required pulling out the internal chassis. Not hard, but an awkward process.
But this wasn’t about easy upgrades. It was about having a computer that you could connect to a power outlet and a phone jack, turn on and log in. Suddenly, online access was easy. I was an old hand at getting online, so it wasn’t so big a deal for me, but I can see where millions of potential customers would find it a revelation. To me, however, the iMac was almost an alien visitor. There was no LocalTalk port, no SCSI port, no floppy drive. But the addition of a USB port — a technology Intel spearheaded, in fact — paved the way for the future.
It didn’t take long for peripheral makers to go USB. The 1.0 version made for slow hard drives, but you didn’t have to mess with SCSI chains, incompatible devices, and terminators. Printers, scanners and other accessories worked just fine, and you can’t imagine how this simplified the connection process.
PC makers didn’t understand when it was time to give up on old technology, and thus the boxes had lots of legacy ports, and you had to juggle cables, driver incompatibilities and so forth. An iMac? It just worked, but it was still just a low-end computer that would be fine for online access and word processing. You couldn’t imagine working with Photoshop or playing games on it, though the former would run well enough despite the poky internal drive.
Over the next 20 years, you would see evidence that Apple had a long-range plan. The iMac went through several design generations before it became what was essentially a monitor with a rear-end that became fat in the center.
The 27-inch iMac, in 2009, was a powerhouse. For most tasks, performance was on a par with the hefty cheese grater Mac Pro, and only fell behind with apps that worked best with at least 8 cores inside. Graphics performance was decent, and the large display was awesome for its time.
It was enough to convince me to sell a slightly older Mac Pro and a large Dell display. I was able to sell the system to a friend, and use the money for a brand new fully-outfitted iMac and a backup drive, and still have a few hundred dollars left to pay some bills.
By 2014, an iMac arrived with the PC industry’s best display — ever — with a resolution of 5K. It allowed you to edit a 4K movie in Final Cut Pro, with enough space left on the screen for the menus. While the first model cost a few hundred dollars more than an iMac with the regular display, it wasn’t long before Apple found ways to build those marvelous 5K displays cheaper, with color improvements. Thus all 27-inch iMacs received 5K displays, with no increase in price.
The PC world was left hanging. Go online and find a 5K standalone display, other than the one LG built with Apple’s assistance. Now find one that’s actually affordable, and seek a PC with the graphics power to drive one without fiddling with multiple cables.
In 2017, Apple released a high-end iMac, the iMac Pro, with a rejiggered cooling system capable of supporting an Intel Xeon processor with up to 18 cores, plus ECC memory. The prices started at just below $5,000 and soared into the five figures. Finding a PC with comparable specs wouldn’t save you any money, and configuring one with a 5K display, other than the one from LG, turns it into an even more expensive proposition.
The journey from its humble beginnings in 1998 as a simple consumer-level all-in-one computer to the most powerful Mac on the planet — at least until the next Mac Pro arrives — is an amazing achievement. The iMac Pro is designed to handle high-end scientific tasks, movie special effects rendering, and other tasks that are required of the most powerful PC workstations.
But there’s still a cheap iMac available. You can buy a 21.5-inch model, with a standard definition display, for $1,099. One with a 4K display is just $200 more — $1,299, the same price as the iMac of 20 years ago. But if you account for 20 years of inflation, the $1,299 you paid for the Bondi Blue iMac in 1998 would be worth $1,984.29 today.
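For readers who want to check that sort of figure themselves, the adjustment is just the old price scaled by the ratio of the consumer price index then and now. A minimal sketch, using approximate U.S. CPI-U values that I'm assuming for illustration (the column's exact $1,984.29 figure evidently came from a slightly different index series), lands in the same neighborhood:

```python
# Rough inflation adjustment for the original iMac's price.
# The CPI values below are approximate U.S. annual averages,
# assumed here for illustration; a different index series will
# give a slightly different result.

CPI_1998 = 163.0   # approximate U.S. CPI-U average, 1998
CPI_2018 = 250.5   # approximate U.S. CPI-U, early 2018

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of price levels."""
    return price * (cpi_now / cpi_then)

original_price = 1299.00  # Bondi Blue iMac, August 1998
today = adjust_for_inflation(original_price, CPI_1998, CPI_2018)
print(f"${today:,.2f}")  # close to $2,000 with these assumed values
```

Whatever index you pick, the point stands: today's $1,299 iMac is, in real terms, a substantially cheaper machine than the 1998 original.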
Even if I could afford the 2019 Mac Pro when it arrives, the iMac remains my computer of choice. Well, until it’s replaced by something cheaper and better.