Over the years, some tech pundits have decided that Apple really needs to drop the Mac. To them, it has outlived its usefulness and, besides, far more money is made from selling iPhones.
But it’s a good source of hit bait to claim that “Mac users don’t really matter to Apple.”
Indeed, Apple has, at times, made it seem as if that claim was accurate. The Mac mini has not been refreshed since 2014. After releasing a total redesign for the Mac Pro in late 2013, Apple appeared to drop the ball and mostly abandoned that model.
When a new MacBook Pro was launched in late 2016, some thought the claim that it was a professional notebook was a huge exaggeration. It was thinner, in the spirit of recent Apple gear, but the highly touted Touch Bar, powered by an ARM system-on-a-chip, was thought to be fluff and not much else.
Apple also got dinged for omissions, such as not offering a model with 32GB of RAM. But that would have required using a different memory controller that might have impacted performance and battery life. In comparison, most PC notebooks at the time were also limited to 16GB. A future Intel CPU update will offer an integrated memory controller that doubles memory capacity.
Just after Christmas, a Consumer Reports review failed to recommend the 2016 MacBook Pro supposedly due to inconsistent battery life. After Apple got involved, it turned out that CR’s peculiar testing scheme, which involves disabling the browser cache, triggered a rare bug. After Apple fixed it, a retest earned the MacBook Pro an unqualified recommendation.
Was all this proof that Apple just didn’t care about Macs?
Well, it’s a sure thing the Touch Bar wasn’t cheap to develop, and embedding an ARM chip in a Mac is definitely innovative. But Apple’s priorities appeared to have gone askew, as the company admitted during a small press roundtable in early 2017.
The executive team made apologies for taking the Mac Pro in the wrong direction, and promised that a new model with modular capabilities was under development, but it wouldn’t ship right away. There would, however, be a new version of the iMac with professional capabilities. VP Philip Schiller spoke briefly about loving the Mac mini, but quickly changed the subject.
Before the 2017 WWDC, I thought that Apple would merely offer more professional parts in customized 27-inch 5K iMacs. But components such as Intel Xeon-W CPUs and ECC memory would exceed that model's thermal limits, so Apple extensively redesigned the cooling system to support workstation-grade parts.
The 2017 iMac Pro, starting at $4,999, is the most expensive, and most powerful, iMac ever. You can upgrade only the RAM, and it's a dealer-only installation, since it requires taking the unit completely apart; on the regular large iMac, memory upgrades are a snap.
Apple promised that a new Mac Pro, which would meet the requirements of pros who want a box that’s easy to configure and upgrade, would appear in 2019, so maybe it’ll be demonstrated at a fall event where new Macs are expected.
But Apple surely wouldn’t have made the commitment to expensive Macs if it didn’t take the platform — and Mac users — seriously. The iMac Pro itself represents a significant development in all-in-one personal computers.
Don’t forget that the Mac, while dwarfed by the iPhone, still represents a major business for Apple. Mac market share is at its highest levels in years in a declining PC market, serving tens of millions of loyal users. When you want to develop an app for iOS, tvOS or watchOS, it has to be done on a Mac. That isn’t going to change. In addition, Apple is porting several iOS apps for macOS Mojave, and developers will have the tools to do the same next year.
According to software head Craig Federighi, iOS and macOS won’t merge and the Mac will not support touchscreens.
Sure, the Mac may play second fiddle to the iPhone, but that doesn’t diminish the company’s commitment to the platform. But it’s still easy for fear-mongering tech pundits to say otherwise, perhaps indirectly suggesting you shouldn’t buy a Mac because it will never be upgraded, or that upgrades will be half-hearted.
Perhaps there's an ulterior motive behind some of those complaints: they are designed to discourage people from buying Macs and push them towards the latest PC boxes that, by and large, look the same as the previous PC boxes with some upgraded parts.
But since Intel has run late with recent CPU upgrades, Apple has often been forced to wait for the right components before refreshing Macs. That doesn’t excuse the way the Mac mini and the MacBook Air have been ignored, but I’ll cut Apple some slack with the Mac Pro, since a major update has been promised for next year.
Now this doesn’t mean the Mac isn’t going to undergo major changes in the coming years. Maybe Apple is becoming disgusted with Intel’s growing problems in upgrading its CPUs, and will move to ARM. Maybe not. But that’s then, this is now.
A few years back, I embarked on an upgrade mission: to swap out the slow hard drive in my 2009 27-inch iMac and replace it with a nice and speedy SSD. With the cooperation of Larry O'Connor of Other World Computing, I got ahold of a 1TB drive and an upgrade kit, consisting of a few tools and suction cups. The latter were used to pry the display from the chassis.
Once the glass was extracted, it had to be placed on a soft surface (I put it on a bed). The rest of the job largely involved carefully unhooking several thin, easily damaged wiring harnesses, and removing the drive. The manufacturer provides an adapter cable to make the new drive compatible with the iMac.
All told, it took about an hour to get through the process and reassemble the computer. O’Connor’s company offers installation videos on his site to simplify the process.
I bring this up because of the first interview on this episode of The Tech Night Owl LIVE, where we were joined by tech columnist Rob Pegoraro, who writes for USA Today, Yahoo Finance, Wirecutter and other publications. At the beginning of this segment, Rob explained that he took apart his vintage 27-inch iMac, from 2009, in order to replace the drive with an SSD from Other World Computing. Gene shared his experiences in upgrading a similar computer several years ago. In later iMacs, the display is held in place with an adhesive strip, making the disassembly and reassembly process far more complicated. There was also a discussion about Siri's voice recognition problems, and a recent report that someone's Amazon Echo Dot, featuring Alexa, recorded a personal conversation and sent the file to a contact in another city.
Can we trust these digital assistants to respect our privacy? Rob also talked about a meeting with security experts discussing changes and possible improvements in online security over the past 20 years.
The Amazon scandal is also discussed in the next article.
After the interview with Rob was recorded, I contacted two local authorized third-party Apple repair shops as to whether they’d be able to upgrade the drive on a more recent 27-inch iMac and how much it would cost. The process involves removing the adhesive that holds the display to the chassis. It’s not something I’d care to tackle.
Well, the first dealer gave a flat no, saying that even trying would damage the computer. That didn’t sound right to me, since Apple uses a similar process to upgrade memory on the iMac Pro. It can’t be upgraded as simply as the regular large iMac, which has a RAM cover at the bottom. Maybe that particular dealer didn’t want to bother or had a bad experience or two.
A second dealer gave me a detailed quote that included labor, two adapters from Other World Computing, plus backup and restore. It came to $457.93!
When I looked at the numbers, though, it sort of made sense, since they charge $200 for a full backup and restore, $19.99 for the replacement adhesive strip, and $79.99 for the needed OWC and Newer adapters. The actual labor comes to $150. OWC sells SSDs with the proper adapters, and the customer can always restore the data themselves, so the price could be as "low" as $169.99.
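For what it's worth, the line items do roughly add up. A quick sanity check in Python (figures as quoted above; the small gap between the sum and the $457.93 quote is presumably sales tax):

```python
# Dealer quote line items, as listed above
backup_restore = 200.00
adhesive_strip = 19.99
adapters = 79.99     # OWC and Newer adapters
labor = 150.00

# The full-service quote, before tax
full_quote = backup_restore + adhesive_strip + adapters + labor
print(round(full_quote, 2))   # 449.98; the quoted total was 457.93

# Do-it-yourself path: buy the SSD with adapters from OWC and restore your own data,
# leaving only labor and the adhesive strip to pay the shop for
diy = labor + adhesive_strip
print(round(diy, 2))          # 169.99
```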
In a special encore presentation, you heard a vintage segment featuring Ben Williams of Adblock Plus. Ad blocking has experienced a lot of activity over the past year, especially since Google entered the fray with its ad filter for Chrome. There are still battles between publishers and ad blockers, and payment systems to publishers from users are being talked about with more frequency. Gene and Ben also engaged in an extended discussion about the value of online advertising, and the long history of making it as offensive as possible. There was also a fun pop culture discussion, about ads that build branding images around a well-known personality, such as Oscar-winning actor J.K. Simmons, known for Farmers Insurance commercials and loads of movies and TV shows, including the recent comic book film, "Justice League," where he played Commissioner Gordon. You also learned how ad blockers can be configured to allow ads that have been approved by Adblock Plus.
On this week’s episode of our other radio show, The Paracast: Gene and guest cohost Goggs Mackay present Dr. Jack Hunter, an anthropologist and author of “Engaging the Anomalous: Collected Essays on Anthropology, the Paranormal, Mediumship, and Extraordinary Experience.” In this book, Dr. Hunter poses serious questions about consciousness, experience, spirits, mediumship, psi, the nature of reality, and how best to investigate and understand them.
In this discussion, Dr. Hunter will present stories of personal experiences, encounters with mediums, and float a wide variety of suggestions as to how various paranormal phenomena might somehow be connected, and that includes the UFO mystery. Dr. Hunter is the founder and editor of a free online journal, Paranthropology.
SSSSHHHH: ALEXA IS LISTENING
Let me start with the Siri follies.
With growing concern that Apple’s Siri digital assistant isn’t capable of matching the competition from Amazon and Google, there are rumors that the next WWDC will feature news of a major refresh. Last year, Apple touted that Siri would receive a new voice and machine learning, but it’s not at all certain there has been much change beyond a smoother conversational tone.
A recently published report featured expressions of sour grapes from former Siri team members at Apple, plus a claim that the assistant worked fine when reporters tested it before it went public. But after it launched, beginning with the iPhone 4s in 2011, Siri's bugs were legion. Maybe it just couldn't cope with the flood of requests once millions of users came online.
The Night Owl's personal experiences are hit or miss. Despite the fact that I have 25 years' experience as a broadcaster, and a decade of voice training, Siri is sometimes deaf to me. A simple example is asking Maps to navigate me to the nearest Walmart. There happen to be several, a few miles apart, but Siri will only produce a list, and rarely does that list display the location I seek. I find it easier to search in Google and manually pick the store to which I want to travel.
But that process hardly makes it hands free. I have to stop somewhere first to make my selection. So I tend to focus on setting alarms or reminders, where Siri is mostly correct.
One excuse given for Siri’s subpar performance is that Apple doesn’t want to infringe on your privacy, so it doesn’t actively collect information about you that is pushed and stored beyond the device itself. The theory goes that, if access to your device and requests were more open, since Siri resides online, you’d achieve more accurate results to more complicated requests.
That takes us to one of the “superior” digital assistants, Alexa, which is featured on the Amazon Echo smart speakers. Indeed, Alexa and the Google Assistant are supposed to represent the cutting edge of voice recognition and response technology.
Apple is often urged to relent on online privacy and deliver a smarter and more dependable Siri. But maybe that's not the right idea after all.
So there's a published report on the results of an overeager Alexa, which confirmed the worst fears about such digital voice assistants. Recording someone's private conversation and emailing it to a contact, even one from their own address book, is the very definition of eavesdropping. I suspect intelligence agencies might be salivating over the ease with which this stunt can be pulled off.
As you might expect, the family contacted Amazon “multiple times,” according to a published report, and conversed with one of the Alexa engineers, who looked into the matter to figure out what went wrong. In the end, the existence of a bug was confirmed.
According to Amazon’s statement, “Amazon takes privacy very seriously. We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future.”
Well, you can hardly expect them to say anything else.
Now I want to be fair to Amazon, and perhaps it was just a glitch as they claimed, one that they will or have already fixed. But how often has this happened, and had there not been publicity about this particular episode, would anything have been done other than perhaps make some excuses to the victims?
To be blunt: Amazon does a fine job delivering merchandise at affordable prices, but its customer service, largely outsourced, is not easy to deal with. Whether a chat or a phone call, you often have to explain and re-explain the problem several times for the basics to be understood.
That doesn’t mean Amazon is being careless about Alexa and how it works as the frontend to a smart speaker. Again, I am not suggesting this mishap was anything more than a rare system glitch of some sort.
One article I read on Alexa’s inadvertent attempt at spying tried to connect it to Apple and the HomePod, and whether it, too, might accidentally record someone’s personal conversation and email it to someone. But that’s not the province of Apple’s smart speaker; we benefit from the fact that it was not designed to record your random conversations in the course of isolating a request.
Maybe you’d rather not have HomePod laden with too many features after all, however useful it might seem to some users.
In yesterday’s column, I expressed my deep concerns about elements of Consumer Reports’ testing process. It was based on an article from AppleInsider. I eagerly awaited part two, hoping that there would be at least some commentary about the clear shortcomings in the way the magazine evaluates tech gear.
I also mentioned two apparent editorial glitches I noticed, in which product descriptions and recommendations contained incorrect information. These mistakes were obvious with just casual reading, not careful review. Clearly CR needs to beef up its editorial review process. A publication with its pretensions needs to demonstrate a higher level of accuracy.
Unfortunately, AppleInsider clearly didn’t catch the poor methodology used to evaluate speaker systems. As you recall, they use a small room, and crowd the tested units together without consideration of placement, or the impact of vibrations and reflections. The speakers should be separated, perhaps by a few feet, and the tests should be blind, so that the listeners aren’t prejudiced by the look or expectations for a particular model.
CR’s editors claim not to be influenced by appearance, but they are not immune to the effects of human psychology, and the factors that might cause them to give one product a better review than another. Consider, for example, the second part of a blind test, which is level matching. All things being equal, a system a tiny bit louder (a fraction of a dB) might seem to sound better.
I don’t need to explain why.
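The level-matching point is easy to quantify, since perceived level differences follow a logarithmic decibel scale. A minimal sketch in Python (the amplitude figures are illustrative, not CR's numbers):

```python
import math

def db_difference(amplitude_a: float, amplitude_b: float) -> float:
    """Level difference in decibels between two signal amplitudes."""
    return 20 * math.log10(amplitude_b / amplitude_a)

# One speaker driven at just 5% higher amplitude than another works out
# to a fraction of a dB -- small, but enough to bias a listening test.
print(round(db_difference(1.00, 1.05), 2))  # about 0.42 dB
```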
Also, I was shocked that CR's speaker test panel usually consists of just two people with some sort of unspecified training, so they "know" what loudspeakers should sound like. A third person is only brought in if there's a tie. Indeed, calling this a test panel, rather than a couple of testers or a test duo or trio, is downright misleading.
Besides, such a small sampling doesn’t consider the subjective nature of evaluating loudspeakers. People hear things differently, people have different expectations and preferences. All things being equal, even with blind tests and level matching, a sampling of two or three is still not large enough to get a consensus. A large enough listening panel, with enough participants to reveal a trend, might, but the lack of scientific controls from a magazine that touts accuracy and reliability is very troubling.
I realize AppleInsider’s reporters, though clearly concerned about the notebook tests, were probably untutored about the way the loudspeakers were evaluated, and the serious flaws that make the results essentially useless.
Sure, it's very possible that the smart speakers from Google and Sonos are, in the end, superior to the HomePod. Maybe a proper test with a large enough listener panel and proper setup would reveal such a result. So far as I'm concerned, however, CR's test process is essentially useless on any system other than those with extreme audio defects, such as excessive bass or treble.
I also wonder just how large and well equipped the other testing departments are. Remember that magazine editorial departments are usually quite small. The consumer publications I wrote for had a handful of people on staff, and mostly relied on freelancers. Having a full-time staff is expensive. Remember that CR carries no ads. Income is mostly from magazine sales, plus the sale of extra publications and services, such as a car pricing service, and reader donations. In addition, CR requires a multimillion dollar budget to buy thousands of products at retail every year.
Sure, cars will be sold off after use, but even then there is a huge loss due to depreciation. Do they sell their used tech gear and appliances via eBay? Or donate to Goodwill?
Past the pathetic loudspeaker test process, we have their lame notebook battery tests. The excuse for why they turn off browser caching doesn’t wash. To provide an accurate picture of what sort of battery life consumers should expect under normal use, they should perform tests that don’t require activating obscure menus and/or features that only web developers might use.
After all, people who buy personal computers will very likely wonder why they aren’t getting the battery life CR achieved. They can’t! At the end of the day, Apple’s tests of MacBook and MacBook Pro battery life, as explained in the fine print at its site, are more representative of what you might achieve. No, not for everyone, but certainly if you follow the steps listed, which do represent reasonable, if not complete, use cases.
It’s unfortunate that CR has no competition. It’s the only consumer testing magazine in the U.S. that carries no ads, is run by a non-profit corporation, and buys all of the products it tests anonymously via regular retail channels. Its setup conveys the veneer of being incorruptible, and thus more accurate than the tests from other publications.
It does seem, from the AppleInsider story, that the magazine is sincere about its work, though perhaps somewhat full of itself. If it is truly honest about perfecting its testing processes, however, perhaps it should reach out to professionals in the industries that it covers and refine its methodology. How CR evaluates notebooks and speaker systems raises plenty of cause for concern.
AppleInsider got the motherlode. After several years of back and forth debates about its testing procedures, Consumer Reports magazine invited the online publication to tour its facilities in New York. On the surface, you'd think the editorial staff would be putting on its best face to get favorable coverage.
And maybe they will. AppleInsider has only published the first part of the story, and there are apt to be far more revelations about CR’s test facilities and the potential shortcomings in the next part.
Now we all know about the concerns: CR finds problems, or potential problems, with Apple gear. Sometimes the story never changes, sometimes it does. But the entire test process may be a matter of concern.
Let's take the recent review that pits Apple's HomePod against a high-end Google Home Max, which sells for $400, and the Sonos One. In that comparison, CR concluded: "Overall the sound of the HomePod was a bit muddy compared with what the Sonos One and Google Home Max delivered."
All right, CR is entitled to its preferences and its test procedures, but let's take a brief look at what AppleInsider reveals about them.
So we all know CR claims to have a test panel that listens to speakers set up in a special room that, from the front at least, comes across as a crowded audio dealer with loads of gear stacked up one against another. Is that the ideal setup for a speaker system that’s designed to adapt itself to a listening room?
Well, it appears that the vaunted CR tests are little better than what an ordinary subjective high-end audio magazine does, despite the pretensions. The listening room, for example, is small with a couch, and no indication of any special setup in terms of carpeting or wall treatment. Or is it meant to represent a typical listening room? Unfortunately, the article isn’t specific enough about such matters.
What is clear is that the speakers, the ones being tested and those used for reference, are placed in the open adjacent to one another. There’s no attempt to isolate the speakers to prevent unwanted reflections or vibrations.
Worse, no attempt is made to perform a blind test, so that a speaker's brand name, appearance or other factors don't influence a listener's subjective opinion. For example, a large speaker may seem to sound better than a small one, but not necessarily because of its sonic character. The possibility of prejudice, even unconscious, against one speaker or another, is not considered.
But what about the listening panel? Are there dozens of people taking turns to give the speakers thorough tests? Not quite. The setup involves a chief speaker tester, one Elias Arias, and one other tester. In other words, the panel consists of just two people, a testing duo, supposedly specially trained as skilled listeners in an unspecified manner, with a third brought in in the event of a tie. But no amount of training can compensate for the lack of blind testing.
Wouldn’t it be illuminating if the winning speaker still won if you couldn’t identify it? More likely, the results might be very different. But CR often appears to live in a bubble.
Speakers are measured in a soundproof room (an anechoic chamber). The results reveal a speaker's raw potential, but they don't show how it behaves in a normal listening room, where reflections will impact the sound that you hear. Experienced audio testers may also perform the same measurements in the actual listening location, so you can see how a real-world set of numbers compares to what the listener actually hears.
That comparison with the ones from the anechoic chamber might also provide an indication how the listening area impacts those measurements.
Now none of this means that the HomePod would have seemed less “muddy” if the tests were done blind, or if the systems were isolated from one another to avoid sympathetic vibrations and other side effects. It might have sounded worse, the same, or the results might have been reversed. I also wonder if CR ever bothered to consult with actual loudspeaker designers, such as my old friend Bob Carver, to determine the most accurate testing methods.
It sure seems that CR comes up with peculiar ways to evaluate products. Consider tests of notebook computers, where they run web sites from a server in the default browser with cache off to test battery life. How does that approach possibly represent how people will use these notebooks in the real world?
At least CR claims to stay in touch with manufacturers during the test process, so they can be consulted in the event of a problem. That approach succeeded when a preliminary review of the 2016 MacBook Pro revealed inconsistent battery results. It was strictly the result of that outrageous test process.
So turning off caching in Safari's usually hidden Develop menu revealed a subtle bug that Apple fixed with a software update. Suddenly a bad review became a very positive review.
Now I am not going to turn this article into a blanket condemnation of Consumer Reports. I hope there will be more details about testing schemes in the next part, so the flaws — and the potential benefits — will be revealed.
In passing, I do hope CR’s lapses are mostly in the tech arena. But I also know that their review of my low-end VW claimed the front bucket seats had poor side bolstering. That turned out to be totally untrue.
CR’s review of the VIZIO M55-E0 “home theater display” mislabeled the names of the setup menu’s features in its recommendations for optimal picture settings. It also claimed that no printed manual was supplied with the set; this is half true. You do receive two Quick Start Guides in multiple languages. In its favor, most of the picture settings actually deliver decent results.
Does anyone recall ever benefiting because one company merged with another? It’s not necessarily similar to Apple’s purchase of Beats and selling expensive headphones, because that deal was more about acquiring technology, which is something that’s been done for years.
So consider the act that saved Apple, acquiring NeXT in 1996, which brought a state-of-the-art Unix-based OS that, over the years, morphed into macOS and iOS. That move came in the wake of the failure of Copland, Apple’s own effort to build a successor to Mac OS. It took a while to jell, but here in 2018, we are still benefiting from the fruits of that transaction.
It also brought Steve Jobs back to Apple, and the rest is history.
From Apple A-series processors, to Touch ID, Face ID and — yes — even Siri, Apple’s ongoing acquisitions of technology companies have delivered compelling features that have advanced the company, and enhanced the user experiences of hundreds of millions of customers.
But when two companies consummate a normal merger, there are almost always promises of realizing synergies, and somehow benefiting customers. In the end, the stockholders and the executives become richer, but people lose their jobs because they are deemed redundant. With fewer competitors, prices just may increase.
When buying a company with different products and services, it may be easier to get approval from the powers-that-be in the U.S. government. Even then, there may be restrictions to reduce corporate excesses of one sort or another. When Comcast, the number one cable and broadband company in the U.S., completed its acquisition of NBC/Universal in 2011, the deal came with restrictions to ensure fair treatment to competing companies.
So Comcast needed to be fair in negotiating deals to carry NBC content, which includes such cable networks as Bravo, CNBC, MSNBC, SyFy and USA.
Over the years, I’ve heard all sorts of tech support horror stories from Comcast cable customers. There’s no indication things became any better after the merger. Of course, the entertainment division isn’t involved in direct interactions with individual consumers.
When AT&T bought DirecTV, the world’s largest satellite TV network, in 2015, the support systems were combined. Not only were jobs lost, but service got a whole lot worse. These days, when I dial up AT&T for satellite or wireless support, I have to navigate through a mostly deaf voice assistant, and I’m often forced to talk to several people just to resolve a simple issue. How does that save money?
I remain a customer for two reasons. First, no other TV service is available at this apartment, which is wired for CenturyLink and includes a single DirecTV satellite dish feeding all the units in each building. Reception via an interior digital antenna is hit or miss. Second, although it was hard to find, I receive an AARP discount for AT&T wireless, and that discount is enough to match T-Mobile's "Uncarrier" price.
Speaking of which, AT&T attempted to merge with T-Mobile in 2011, but the government said no. Forced to compete on its own terms, T-Mobile began its “Uncarrier” promotion, which did away with standard two-year contracts, and overhauled the industry.
As a result, your wireless bill is no doubt cheaper regardless of the carrier. T-Mobile is growing rapidly; the move spurred Sprint to slash its prices, so both became more compelling alternatives to market leaders Verizon and AT&T.
Now T-Mobile and Sprint are trying to become one. But T-Mobile’s flashy CEO, John Legere, insisted that there will still be more competition in the market than most believe: “This isn’t a case of going from four to three wireless companies—there are now at least seven or eight big competitors in this converging market.”
Or maybe not.
True, cable providers are entering or planning to enter the cell phone market, but it's not at all likely that they'll suddenly become major competition for the big four (make that big three if this merger is consummated).
At the same time, it is true that T-Mobile and Sprint together would provide healthier competition to Verizon and AT&T. As it stands, T-Mobile has good cellular coverage in larger cities but relatively poor coverage in rural areas. A larger footprint will also provide more network resources and revenue to speed deployment of 5G networks.
In theory, that should be a good thing.
Then again, as Sprint learned when it bought Nextel in 2005, combining two incompatible networks is no easy task. Basically Nextel was shunted to the side in the wreckage of that deal.
So T-Mobile uses GSM, same as AT&T, while Sprint uses CDMA, same as Verizon. Sprint claims some 20 million customers have handsets that are compatible with T-Mobile, which will be the surviving company. Even after a migration period of two to three years, millions of users will still be left with incompatible handsets, unless the equipment supports LTE and is deployed in an area with an acceptable LTE signal. I just hope there will be special discounts to help people with orphaned phones upgrade.
While Legere also claimed that more employees will be needed at the combined company, that may be a tricky response. Perhaps there will be more workers to perform the hardware migration and upgrades. But what about sales and support people? How many of them will be getting pink slips? Doesn't it make sense that there will be thousands of redundant positions, or does T-Mobile expect many of these employees to transfer to the hardware division?
Will prices really go down?
Of course, this deal hasn't been okayed by the authorities, and there may be restrictions to protect customers with potentially obsolete gear, among other things. It would be nice to see guarantees that prices won't increase, but such restrictions are usually temporary. What will the market be like in five years?
I am, however, pleased that the new company will be in T-Mobile’s image and not Sprint’s. I tried Sprint in the early 2000s, before switching to AT&T. As bad as the latter’s support is now, Sprint was far, far worse.
Regular readers know that I’ve spent an awful lot of time correcting fake news about Apple. Is it because I’m an Apple fanboy? No, it’s more about my obsessive nature. Without claiming that I’m in any way perfect, I dislike reading false information about anyone or anything. I’m very much in favor of reporting the news as accurately as possible and correcting mistakes when it’s necessary, even very slight ones.
In a sense, then, these columns are very much works in progress. When the story changes, or I discover a typo, I update. It's one of the good things about the Internet, but the Internet also makes it easier to post falsehoods without much in the way of consequences. It's just more clutter, and there's so much of that you can barely keep up.
Now maybe there is hope. Last week’s revelation that the iPhone X, even through the March quarter, was Apple’s best-selling smartphone and, in fact, the best-selling smartphone on planet Earth, should surely have convinced the naysayers that they were wrong all along about sales collapsing. Or maybe not. The nonsense about supposed negative supply chain data pointing to poor sales of Apple gear was disproven long ago.
But some people never learn, or maybe there’s an advantage in saying bad things about Apple, even though a lot of those statements are outright lies.
In any case, on this week’s episode of The Tech Night Owl LIVE, we presented commentator Josh Centers, Managing Editor for TidBITS, and author of “Take Control of Apple TV” and other titles, who focused a main part of his conversation with Gene on Apple’s record earnings for the March 2018 quarter. Despite all the unfounded rumors of poor iPhone X sales, which hurt the company’s stock price for several weeks, Apple reported that its flagship smartphone was its top-selling gadget for every week it was on sale — and thus the top-selling mobile handset on the planet. You also heard about Apple’s decision to discontinue AirPort Wi-Fi routers, why it may have occurred, and possible alternatives. And what about the announcement that, once again, T-Mobile and Sprint are attempting a merger? Will the attempt succeed this time with a different administration in Washington? Will customers receive better service, and how will prices be impacted? What about the fate of employees of both companies, and the task of merging two incompatible cellular networks? Josh also explained why, for now, he’s basically stuck with Verizon Wireless in the rural area in which he lives.
You also heard from outspoken columnist Bryan Chaffin, co-founder and co-publisher of The Mac Observer, who explained why false rumors about alleged poor iPhone X sales got his dander up. Gene and Bryan talked at length about such fake stories, and how Apple actually fared during the March quarter compared to last year. There were also discussions about the proposed T-Mobile/Sprint merger, and how the plan differs from AT&T’s plans to join forces with Time Warner. Will the political winds in Washington force AT&T to ditch CNN to get the merger approved by the Department of Justice? There was also a discussion about the news that Twitter has asked its entire membership to change their passwords because of a purported error in storing them internally in plain text. Twitter claimed outsiders were not impacted, but that didn’t stop Gene from immediately changing his password.
On this week’s episode of our other radio show, The Paracast: Gene and guest co-host Michael Allen present a return visit by researcher MJ Banias, a blogger who critically and philosophically examines the weird, the strange and the anomalous. During this episode, MJ will discuss the latest episode of the “MUFON Follies,” a new documentary about the Flatwoods Monster, a creature seen in West Virginia in 1952, and even how he accidentally got involved in debates over the Billy Meier contacts. And what about the alleged alien agenda? MJ is a former field investigator with MUFON, has been featured on multiple podcasts and radio shows, and contributes to Mysterious Universe and Rogue Planet. His work has been included in FATE Magazine, and in a collection of UFO-related essays entitled UFOs: Reframing the Debate.
I REMEMBER THE iMAC
In 1998, the typical Mac was a large beige desktop, or a black PowerBook. Simple, conservative, powerful. In those days the PowerPC ate Intel Pentiums for lunch. It took years for the PowerPC’s reign as the fastest PC processor to end.
In May of that year, Steve Jobs announced a revolution in personal computing — with an emphasis on simple Internet access — the iMac. It didn’t ship until August of that year, but I already had one in my home. As a member of Apple’s Customer Quality Feedback program, I was beta testing the original Bondi Blue iMac. It would go on sale for $1,299, but my Apple contact told me I could keep it if it survived a final firmware update.
I wasn’t surprised to see it didn’t, and thus I sent it back for, they told me, proper disposal. But armed with that experience, and with Apple’s approval, I wrote an article about iMac for a Phoenix newspaper, which included an interview with none other than Jonathan Ive.
In retrospect, the iMac was a revolution, setting the stage for future Macs, but to me it was just a low-end consumer all-in-one computer. It took a while to see the method in Apple’s madness. To me it didn’t provide the higher end features I needed for my work.
As a practical matter, though, its 233 MHz PowerPC G3 was as powerful as the one offered in the most expensive Power Macintosh minitower the previous year, although many of its parts came from the PowerBook. Just after Apple finally got the RAM upgrade process simplified for Macs, doing it on an iMac required pulling out the internal chassis. Not hard, but an awkward process.
But this wasn’t about easy upgrades. It was about having a computer that you could connect to a power outlet and a phone jack, turn it on and log in. Suddenly, online access was easy. I was an old hand at getting online, so it wasn’t so big a deal for me, but I can see where millions of potential customers would find it a revelation. To me, however, the iMac was almost an alien visitor. There was no LocalTalk port, no SCSI port, no floppy drive. But the addition of a USB port — an Intel invention in fact — paved the way for the future.
It didn’t take long for peripheral makers to go USB. The 1.0 version made for slow hard drives, but you didn’t have to mess with SCSI chains, incompatible devices, and terminators. Printers, scanners and other accessories worked just fine, and you can’t imagine how this simplified the connection process.
PC makers didn’t understand when it was time to give up on old technology, and thus the boxes had lots of legacy ports, and you had to juggle with cables, driver incompatibilities and so forth. An iMac? It just worked, but it was still just a low-end computer that would be fine for online access and word processing. You couldn’t imagine working with Photoshop or playing games on it, though the former would run well enough despite the poky internal drive.
Over the next 20 years, you would see evidence that Apple had a long-range plan. The iMac went through several design generations before it became what was essentially a monitor with a rear-end that became fat in the center.
The 27-inch iMac, in 2009, was a powerhouse. For most tasks, performance was on a par with the hefty cheese grater Mac Pro, and only fell behind with apps that worked best with at least 8 cores inside. Graphics performance was decent, and the large display was awesome for its time.
It was enough to convince me to sell a slightly older Mac Pro and a large Dell display. I was able to sell the system to a friend, and use the money for a brand new fully-outfitted iMac and a backup drive, and still have a few hundred dollars left to pay some bills.
By 2014, an iMac arrived with the PC industry’s best display — ever — with a resolution of 5K. It allowed you to edit a 4K movie in Final Cut Pro, with enough space left on the screen for the menus. While the first model cost a few hundred dollars more than an iMac with the regular display, it wasn’t long before Apple found ways to build those marvelous 5K displays cheaper, with color improvements. Thus all 27-inch iMacs received 5K displays, with no increase in price.
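To put the 5K claim in concrete terms, here is a quick sketch of the pixel math. The dimensions below are the standard published specs (they aren’t quoted in the column): the iMac’s 5K panel is 5120 by 2880 pixels, large enough to show a 4K frame at full size with pixels left over for Final Cut Pro’s interface.

```python
# Why a 5K display can show a full 4K video frame with room to spare:
# compare panel resolution against common 4K frame sizes.
display_5k = (5120, 2880)   # 27-inch 5K iMac panel
dci_4k = (4096, 2160)       # DCI 4K, common in cinema workflows
uhd_4k = (3840, 2160)       # consumer "4K" Ultra HD

def leftover(display, frame):
    """Pixels remaining around a frame displayed 1:1 on the panel."""
    return (display[0] - frame[0], display[1] - frame[1])

print("Around a DCI 4K frame:", leftover(display_5k, dci_4k))   # (1024, 720)
print("Around a UHD frame:", leftover(display_5k, uhd_4k))      # (1280, 720)
```

Over a thousand horizontal pixels remain in either case, which is why a full-resolution 4K preview and the editing controls fit on screen at once.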
The PC world was left hanging. Go online and find a 5K standalone display, other than the one LG built with Apple’s assistance. Now find one that’s actually affordable, and seek a PC with the graphics power to drive one without fiddling with multiple cables.
In 2017, Apple released a high-end iMac, the Pro, with a rejiggered cooling system capable of supporting an Intel Xeon processor with up to 18 cores plus ECC memory. The prices started at just below $5,000 and soared into the five figures. Finding a PC with comparable specs wouldn’t save you any money, and configuring one with a 5K display, other than the one from LG, turns it into an even more expensive proposition.
The iMac’s journey from its humble beginnings in 1998 as a simple consumer-level all-in-one computer to the most powerful Mac on the planet (at least until the next Mac Pro arrives) has to be an amazing achievement. The iMac Pro is designed to handle high-end scientific tasks, movie special effects rendering, and other tasks that are required of the most powerful PC workstations.
But there’s still a cheap iMac available. You can buy a 21.5-inch model, with a standard definition display, for $1,099. One with a 4K display is just $200 more, the same price as the iMac of 20 years ago. But if you account for 20 years of inflation, the $1,299 you paid for the Bondi Blue iMac in 1998 would be worth $1,984.29 today.
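The inflation adjustment above can be sketched as a simple CPI ratio. The CPI figures below are approximate annual averages I’ve filled in for illustration (they aren’t from the column), which is why the result lands near, but not exactly on, the $1,984.29 figure:

```python
# Rough inflation adjustment for the 1998 iMac's $1,299 price tag.
# CPI levels are approximate U.S. CPI-U averages (illustrative assumptions).
CPI_1998 = 163.0
CPI_2018 = 249.0

def adjust_for_inflation(price, cpi_then, cpi_now):
    """Scale a historical price by the ratio of the two CPI levels."""
    return price * cpi_now / cpi_then

adjusted = adjust_for_inflation(1299.00, CPI_1998, CPI_2018)
print(f"$1,299.00 in 1998 is roughly ${adjusted:,.2f} in 2018 dollars")
```

In other words, today’s $1,299 4K iMac is meaningfully cheaper in real terms than the original Bondi Blue machine was.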
Even if I could afford the 2019 Mac Pro when it arrives, the iMac remains my computer of choice. Well, until it’s replaced by something cheaper and better.
After several weeks of fake news about iPhone X sales, Apple revealed the truth. It was the company’s best-selling smartphone every single week it was on sale over two quarters. This is the first time Apple’s most expensive model achieved that level of sales.
This comes after all the fear-mongering that people wouldn’t pay for a mobile handset costing $999 and more, depending on the configuration. There were surveys demonstrating that a majority of potential customers would reject the costlier models, which is understandable. But with iPhones starting at $349, it only demonstrated that different people have different priorities and different budgets.
But the iPhone X still led the pack among iPhones. I’m sure this is clear to you.
Now I suppose some of you might be skeptical of Apple’s claims about revenue, profits, and the number of items shipped. But the company is following SEC requirements. Filing false reports could get them in a heap of trouble. Look up companies who have run afoul of that agency.
In short, it’s fair to say that Apple is reporting the truth, whereas some members of the media who have repeated the fictions about poor sales are clearly mistaken, or perhaps deliberately lying.
Some of the fake news about poor iPhone X sales allegedly originates from the supply chain. But Apple CEO Tim Cook has said on several occasions that you can’t take one or a few supply chain metrics and assume anything about sales. Apple will routinely adjust supply allocations among different manufacturers and, in some cases, manage inventory in different ways that will impact total shipments.
What’s most disturbing about the iPhone X is that false reports of poor sales are only the latest in a long stream of falsehoods published about the product.
Even when the iPhone X was still being referred to as an iPhone 8, there were claims that Apple had to make a critical last-minute design change because it couldn’t find a way to embed a front-mounted Touch ID sensor in or beneath an edge-to-edge OLED display. The rumors were bolstered by the claim that this was the reason Samsung put its fingerprint sensor at the rear of its units.
Sure, Apple went to Face ID, but that feature was supposedly under development for several years. Regardless of the alleged limitations of an OLED display, Apple may have switched to facial recognition anyway. Indeed, there are reports it may ultimately replace Touch ID on all gear.
Once the rumors about facial recognition became more credible, the next effort at fear-mongering suggested it would present potential security problems, or maybe not even work so well. After all, Samsung has a similar feature that can be readily defeated with a digital photo, at least on the Galaxy S8 smartphone. I’m not at all sure at this point whether there are similar limitations on this year’s Galaxy S9, which supposedly has improved biometrics.
Even after Face ID proved to be extremely reliable — nobody claims perfection — there were the inevitable complaints that the iPhone X would be backordered for weeks or months, and thus, after it was introduced early in November of 2017, you wouldn’t be able to get one in time for the holidays.
Over the next few weeks, Apple managed to mostly catch up with orders. So in the days before Christmas, you still had a good chance of getting one on time.
That’s when the critics began to suggest sales had been underwhelming. Apple’s great experiment in fueling an alleged — and never confirmed — iPhone “super” upgrade cycle had failed.
When Tim Cook announced that the iPhone X was the best-selling iPhone and the best-selling smartphone on the planet for each week it was on sale in the December quarter, the next rumor had it that sales collapsed after the holidays, and March quarterly numbers would be perfectly awful.
It got to a point by mid-April that Apple’s stock price, which had approached $180 per share, plummeted to near $160. You can see the trend over at Yahoo Finance and similar sites.
After this week’s news from Apple that all these unfavorable reports were false, the stock price soared. It closed at $176.57 on Wednesday.
So is that the end of the latest cycle of spreading fake news about Apple? I doubt it. There were similar rumors about previous iPhones, using alleged supply chain cutbacks to fuel such claims. In each case, the rumors turned out to be false, only to return months later in full force.
One would think that, after this keeps happening, the reporters, bloggers and industry analysts who keep spreading this nonsense would learn a thing or two. Then again, if some of it is designed to talk down the stock price, and thus allow the instigators to buy the stock at a lower price before it increases again, you can expect it won’t stop.
I suppose some of these rumors may also have been started by Apple’s competitors. I would hope that the media won’t be fooled by such antics anymore.
But don’t bet on it.
Some feel that Apple should be doing more, producing a greater variety of products. After all, a company of its size ought to be able to deliver a far wider catalog of tech gear. To some it may be seriously underperforming based on its huge potential.
Take the expected decision, as announced last week, to discontinue AirPort routers. After all, Apple was a pioneer in that business, so why should it abandon it? One key reason may be that there is no longer a place for Apple’s entry into this market. If sales were good and profits were high, AirPort would surely have had further updates after the last one, in 2013. It was no doubt strictly a business decision.
Compare that to the Apple LaserWriter, one of the original products that heralded the desktop publishing revolution. Equipped with Adobe PostScript, a LaserWriter was a mainstay for businesses, an expensive mainstay.
By 1997, when the LaserWriter was killed by Steve Jobs, it was hardly a unique product. There were plenty of equivalent printers available, all compatible with Macs, and Apple needed to ditch underperforming gear. So the LaserWriter joined the Newton and other products in being discontinued.
In any case, on last week’s episode of The Tech Night Owl LIVE, we presented outspoken commentator Jeff Gamet, Managing Editor for The Mac Observer, who briefly talked about the Slenderman urban legend, which was featured on our other radio show, The Paracast, before jumping full tilt into technology. There was a detailed discussion about Apple’s decision to discontinue AirPort routers, and why, after pioneering that business, it decided to give it all up. What about reports that the HomePod smart speaker system isn’t selling so well? What about a thought piece, so to speak, in Macworld about products Apple ought to give up? Gene and Jeff pointed out that one of the items on the list, the Mac mini, continues to get the love from Apple with positive statements from such executives as Tim Cook and Philip Schiller. The state of iTunes for Mac and Windows was discussed, plus the possibility that Apple might move the Mac platform to its customized ARM-based processors, or is there yet another option?
In a special encore presentation, you also heard from columnist Rob Pegoraro, who writes for USA Today, Yahoo Finance, Wirecutter and other publications. He discussed in detail his trip to Cape Canaveral to witness the launch of the SpaceX Falcon Heavy launch vehicle, the most powerful rocket ship the company has developed so far. Rob also explained what happened when he got lost. He briefly talked about his expectations for Apple’s smart speaker, the HomePod, before discussing unexpected privacy issues involving an activity-tracking social network known as Strava, and the downsides of publicly revealing the location of its users, especially if that location is a secret U.S. military base. The privacy of connected cars was also discussed, particularly concerns about all that driving data a car collects, which can be used by an insurance company, with a plug-in receiver, to track your driving record. Gene and Rob also discussed whether car makers should make it easy for you to erase your data when you trade in the vehicle or it’s totaled.
On this week’s episode of our other radio show, The Paracast: Gene is joined by guest co-host Michael Allen in welcoming prolific paranormal author Nick Redfern back to The Paracast. Nick discusses the book, The Slenderman Mysteries: An Internet Urban Legend Comes to Life. Is it possible to invent a myth online, and have it emerge with frightening reality? Indeed, The Slenderman may be a tulpa, a thought-form that can stride out of our darkest imaginations and into reality if enough people believe in it. Nick Redfern is the author of 40 books, including Immortality of the Gods, Weapons of the Gods, Bloodline of the Gods, Monster Files, Memoirs of a Monster Hunter, The Real Men in Black, The NASA Conspiracies, Keep Out!, The Pyramids and the Pentagon, Contactees, The World’s Weirdest Places, For Nobody’s Eyes Only, and Close Encounters of the Fatal Kind.
WHAT IF A THIRD PARTY INK CARTRIDGE DAMAGES YOUR PRINTER?
It’s well-known that printer makers earn most of their profits from the consumables, not the purchase of the original product. Indeed, during a normal lifecycle, you’ll pay the hardware’s price over and over again to keep it going. But there have been efforts to reduce the cost of consumables, such as Epson’s EcoTank printers, although your upfront price is far higher in exchange for cheaper ink.
Some suggest that printer ink can cost more than an ounce of gold, but that might be pushing it. Still, consider just one example of overpriced ink. The usual going rate for an OEM, or factory-built, ink cartridge may be over $30 if you buy the “extra capacity” version. For my all-in-one, Epson’s WorkForce WF-3640 printer, which has been out of production for a while, the 252XL cartridge is $34.99 at most mainstream dealers, such as Staples. Add a similar amount for each of the remaining three colors.
Now most of my printing is handled by a cheap Brother laser. From the day that the original factory toner cartridge was spent, I bought remanufactured cartridges. I am guided by the combination of high ratings and a low price at Amazon in choosing what to buy. Of late, I’ve used the INK4WORK brand, which costs $14.98 for its replacement for Brother’s TN-850 High Yield Toner Cartridge. Brother’s version is $106.99 after discount.
You can see where I’m going.
Well, print quality is almost identical to the OEM version, except for a slight streak every so often. There is no evidence whatever that the printer has suffered any harm, so I’m happy to continue using it.
However, I didn’t do nearly as well with the WF-3640. I only print color occasionally, so replacing the color cartridges isn’t worth the cost, but the original high-capacity black cartridge finally ran out. I found an LD Products replacement for $9.99 and bought one last week.
I didn’t expect trouble, since LD is supposed to be a pretty reliable brand. But sometimes reality doesn’t match one’s expectations.
In doing some online research, I ran across this comment in a Consumer Reports article on using third-party printer cartridges. I should have paid closer attention to this phrase: “some aftermarket inks worked initially but quickly clogged printer heads.”
So after I installed the black cartridge replacement from LD Products, the printer went through its long setup cycle before outputting the first page. I printed a web page consisting mostly of text with a single color illustration, but I didn’t expect what emerged from the printer’s output tray.
While the color print was mostly good, the black text was inconsistent, barely readable with lots of faded copy.
Before printing another document, I ran a test print to check the condition of the print heads. While three of the four colors were fine, the black parallel lines on the page had huge gaps in them.
I ran the printer’s head cleaning function, which uses quite a bit of ink during the cycle. After one cleaning cycle, the black lines weren’t so bad, but there were still gaps in them. The other colors remained solid, so I concentrated on the black only option as I ran the printer through another four cycles, running a test print between each. There was no further improvement.
Now one serious downside of using a remanufactured or third-party cartridge is that the manufacturer might void your warranty and/or refuse to repair a damaged unit if there’s evidence you used something other than OEM ink. I didn’t take this seriously before, because the only problems I’ve seen over the years with other inkjets were inconsistent print quality; if I went back to OEM, things were just fine.
I contacted Epson via online chat to see if they had any advice to clear the print heads, aside from taking the unit in for repair. Since the warranty had expired, getting free service would have been out of the question in any case.
The tech took me through the standard process of multiple cleaning operations, and removing and replacing the cartridge. There was no further improvement, so I was advised to take the printer to an authorized repair center to address what they concluded was a hardware-related problem. The print head assembly is not user replaceable as it is on other inkjets.
While there are third-party products that promise to clear clogged print heads on an Epson printer, they are not guaranteed to work. One product sold by Amazon had this cautionary note, “Printer cleaning is successful 95% of the time, but does require a supply of fresh ink and carries a small risk of damage to the printer.”
If it doesn’t work, you’ll get a refund, with no guarantee of repair or replacement of a broken printer.
So before considering whether to invest money I didn’t have in fixing an older printer, I contacted Amazon, whose only solution was to either replace the cartridge or give me a refund. I chose the latter, but opted to contact LD Products anyway in the hope that they might offer something more substantial because of what happened to my printer.
I didn’t mention that Amazon had already refunded my money; I wanted to hear LD’s own solution. The best they’d offer was to send a replacement cartridge, and I’ll grant the one I bought might be defective. I asked the tech what they’d do in the event the replacement doesn’t help, and the solution was to send another cartridge.
When I asked whether they’d pay for repair or replacement, I was informed that, if the repair shop agreed that the printer was probably damaged by the cartridge, they’d agree to pay part — but not all — of the repair cost.
Depending on how much the repair costs, it might be worth it. But how do you prove any specific ink cartridge damaged the printer, other than to leave it in the unit to demonstrate it had been used? Even then, the causal factor can only be inferred.
As I wait for the replacement cartridge to arrive, I’m frankly disappointed at the turn of events. Switching to OEM is a non-starter even if the printer is ultimately repaired. Amazon’s price for the set of four Epson 252XL high capacity cartridges is $140, with an estimated yield of 1,100 copies. Compare that to what I get from that remanufactured Brother laser toner cartridge, between 4,000 and 5,000 copies from an investment of just $14.98.
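The gap in that comparison is easier to see as cost per page. Using the figures quoted above (the page yields are the column’s estimates, not manufacturer-rated numbers, and I’ve taken the conservative end of the toner range):

```python
# Cost-per-page comparison from the prices and yields quoted above.
epson_set_price = 140.00   # four Epson 252XL high-capacity ink cartridges
epson_yield = 1100         # estimated pages per set

brother_price = 14.98      # remanufactured TN-850-compatible toner cartridge
brother_yield = 4000       # low end of the 4,000-5,000 page estimate

epson_cpp = epson_set_price / epson_yield
brother_cpp = brother_price / brother_yield

print(f"Epson inkjet: {epson_cpp * 100:.1f} cents per page")
print(f"Brother laser (remanufactured): {brother_cpp * 100:.2f} cents per page")
print(f"The inkjet costs roughly {epson_cpp / brother_cpp:.0f}x more per page")
```

By this rough math, the inkjet’s consumables cost over 30 times more per page than the remanufactured laser toner, which is why switching back to OEM ink is a non-starter.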
The most recent World Health Organization ranking of the world’s health systems has the United States at 37th -- seven spots behind its neighbor to the north, Canada, and 19 spots behind America’s predecessor, the United Kingdom. That might not seem so bad on a list 190 nations long, but the United States ranks last in health care system performance among the 11 richest countries included in a study conducted by The Commonwealth Fund. In that study, “the U.S. ranks last in Access, Equity, and Health Care Outcomes, and next to last in Administrative Efficiency, as reported by patients and providers.”
Much of our inflated health insurance premiums in America comes from paying to create your bill. That’s right -- 25 percent of total U.S. hospital costs are administrative costs. The United States had the highest administrative costs of the eight countries studied by The Commonwealth Fund. Scotland and Canada had the lowest, and reducing U.S. per capita spending for hospital administration to Scottish or Canadian levels would have saved more than $150 billion in 2011.
Treating healthcare like any other marketplace requires careful, complicated codification of products sold and services rendered. People must be paid to determine how much your healthcare costs, and that can’t be changed, but it can be improved upon. Allowing insurance companies to profit from people’s health makes for a marketplace in which every cent of cost is counted and every penny of profit is protected. Profit motive always results in more scrutiny by the haves at the expense of the have-nots.
You might think that an industry that preys on the unhealthy and the healthy alike would prefer its consumers healthy, so as to enjoy the profits from your premium payments without paying out for healthcare. But the cost of your health insurance premium already includes your health insurer’s profit margin. The health insurer is going to do all it can to assure a certain amount of profit, barring a catastrophic health emergency that consumes the country. But if the consuming population is unhealthy relative to other markets, the health insurer has good reason to inflate prices to cover its projected costs. That is indeed the case in the United States.
The United States is the 34th healthiest nation in the world, according to 24/7 Wall St. That’s not terrible, but not what you probably expect from a nation advertised by Americans as the greatest in the world. And you’re paying for it.
Not unlike a mortgage or auto insurance premium, the cost of your health insurance premium is an average based on the health insurer’s risk. That risk is the potential costs the health insurer could incur based on the perceived health of its insured consumers. I’ve written in the past how Republicans can’t repeal and replace Obamacare because their constituents, most of whom reside in the South, need Obamacare. Southerners are the least healthy Americans, with 20 percent reporting fair or poor health in 2014. The South also has the highest rates for diabetes, obesity and infant mortality in the nation. The South also accounts for nearly as many uninsured people as the rest of America combined, and 17 percent of the uninsured fall into the coverage gap for Medicaid expansion. Your health insurance premiums pay for their healthcare as well as your own, which is why, given the current for-profit health insurance marketplace, I would welcome a fat tax.
A fat tax is a tax on fat people. People who live unhealthy lifestyles should pay more for health insurance. As a healthy consumer of health insurance, I’d prefer to pay a lower premium given my dedication to maintaining good health at the expense of those who refuse to maintain good health. I might be fat shaming some people, but I don’t care. I shouldn’t have to pay for your diabetes because you can’t resist stuffing your face with Twinkies. Maintaining your health is your responsibility and no one else’s, and you should be punished for failing to maintain good health at the expense of your neighbors. But since anything that could be labeled a fat tax by the opposition would never pass Congress, rewarding people with discounts for their healthy habits would be much more likely.
I foresee this program as mirroring the Progressive auto insurance Snapshot program -- “a program that personalizes your rate based on your ACTUAL driving.” Instead of plugging a device into y