June 6, 2024

What's your laptop or tablet's gas mileage?

How long does a charge of the battery in your phone, tablet or laptop last? How long is it supposed to last? If the tech specs do include battery life, can you trust that number? How was it measured? Can you rely on it? How much does battery life matter? Or is it just something that should last long enough?

When Steve Jobs introduced the Apple iPad in 2010, he said that such a tablet should run ten hours on a charge, and that sort of became what people expect from a tablet. With mobile computers used on the job, the industry more or less agreed that a device should last a full shift, generally eight hours. But, again, how is that measured? Are there any standards?

Why isn't there a "Miles-per-Gallon" for mobile computers?

Battery life of a device is generally treated like an afterthought in its specifications. It's usually listed as "approximately" so and so long, or "up to" so many hours. A tiny asterisk may refer to additional information in the fine print at the bottom of the specs. That just doesn't seem like enough.
Since battery life definitely matters, why can't it be tested and listed more like gas mileage on a car or truck, so you know how many miles per gallon you can expect (or liters per 100 km in other parts of the world)? Those measures, mandated by the government, may not be perfect, but they do provide an idea of how efficient a vehicle is, and how often you have to fill up. If you know how much fuel goes into your tank and how many miles per gallon your vehicle gets, that gives you a pretty good idea how far you can go and whether that's good enough for your driving habits and patterns.

That applies to gasoline-powered vehicles. With electric cars, it's a bit more difficult. There you have to go by how many miles the manufacturer or the government says you're going to get from a full charge. You may know what the capacity of your battery is or how much charge you have left, but there isn't a universally accepted measure of efficiency, like the miles per gallon. Plus, you have the extra issues of how fast a charger charges the car, and there really isn't an equivalent to the city, highway, and mixed mileage numbers you get for a gas-powered vehicle. And the maximum capacity of a battery tends to diminish over time. Car magazines have started efforts to include miles-per-gallon equivalents for electric cars that show a vehicle's efficiency, like miles per kilowatt-hour, but it's a work in progress.

Computing electric “mileage”

How does all of the above apply to mobile computers? They have a battery, or batteries, with so and so many watt-hours of capacity. What does that mean? Well, watt-hours are a unit of the electrical energy stored in a battery. Watts are computed as amps x volts. Amps measure the rate at which current flows through a circuit. Volts are sort of like the water pressure in a hose.
An LED bulb may draw four watts, so having that bulb run for one hour takes four watt-hours. If that LED bulb were in a flashlight powered by three AA batteries, how long would the flashlight work on a set? A double-A battery may be listed as 1.5 volts and 2,800 mAh. Watt-hours = volts x amp-hours, so that'd be 1.5 V x 2.8 Ah = 4.2 watt-hours per battery, or 12.6 watt-hours for all three. So that flashlight would burn for about three hours until the batteries are spent.

How can we use those physical facts to figure out how long a charge in a mobile computer lasts? Well, there are a number of ways. We can let the computer run and use an app that measures how many watts it draws. Say the computer draws six watts and the battery is rated at 60 watt-hours: divide 60 watt-hours by 6 watts and you get 10 hours. So that system would run 10 hours on a charge. And we can conclude that the system uses about six watt-hours for each hour of operation.
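To spell out that arithmetic, here's a little sketch in Python, using just the example numbers above; the function names are ours, purely for illustration:

def watt_hours(volts, amp_hours):
    # Energy stored in a battery: volts x amp-hours = watt-hours
    return volts * amp_hours

def runtime_hours(capacity_wh, draw_watts):
    # How long a charge lasts: capacity divided by average power draw
    return capacity_wh / draw_watts

# The flashlight: three AA cells at 1.5 V and 2.8 Ah each, feeding a 4-watt LED
flashlight_wh = 3 * watt_hours(1.5, 2.8)      # 12.6 watt-hours
print(runtime_hours(flashlight_wh, 4))        # about 3.15 hours

# The laptop: a 60 watt-hour battery drained at an average of 6 watts
print(runtime_hours(60, 6))                   # 10 hours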

But, just as with gas-powered vehicles, mileage varies depending on the type of driving. A vehicle may be EPA-rated at 30 miles per highway gallon, but it's less in city driving, and if you pulled a trailer up a mountain, it might be closer to 10 miles per gallon. It's the same with electricity-powered computers. If the laptop or tablet just idles, it'll use much less energy than when it's loaded down with complex tasks. There is just no way around that.

The way it’s currently done

But could there not be a standardized way of measuring electrical "mileage"? With gas-powered vehicles, the EPA clearly describes the type of driving used to determine city and highway MPG testing. The same could be done for computers. There could be a carefully designed set of tasks that the system would run, and the test would then measure how much energy is consumed to complete the test. The results would then give users an idea of how much battery "life" they can expect.
That is actually sort of how the computer industry does it today. Except that there really are no standards. One company may use one piece of benchmarking software and the next one another. And stating how many hours the battery will last really only goes for whatever battery is in the system. Since computers' batteries often come with different storage capacities, the "battery life" number is only valid for the battery with which the test was done. And whether or not the result is meaningful depends entirely on how much load the testing utility puts on the system.

The result is that the battery life you'll find in mobile computer specs is not as helpful as it should be. Which is unfortunate, because battery life on a charge matters. Isn't there a better way? In cars, mileage will vary, but at least all manufacturers must use the exact same EPA tests, so the results are somewhat comparable and meaningful. That is not currently the case with battery-powered computers.

Is there a better way?

But what if there were a way? Everyone could, for example, agree to use the exact same testing software that puts a carefully designed mixed load on the system during the testing. It would also fix other parameters, like, for example, how bright the display must be during the testing. A very bright backlight draws a lot of power, so brightness should be set to a reasonable level, say 200 nits -- perfect for indoor use -- for testing.
Then the testing would be conducted, and the utility, knowing the rated watt-hour capacity of the battery used, would measure how long it takes to draw the battery charge down from, say, 90% to 10%. Once the test is done, you know how many watt-hours the system draws per hour.

Like miles per gallon on your vehicle, you'd know watt-hours per hour on your computer. If the laptop draws 10 watt-hours per hour and the battery has 80 watt-hours of capacity, then a full charge of the battery will get you eight hours of continuous operation. You could then also easily calculate how long a smaller or a bigger battery would last. And you would know how power-efficient one computer is compared to another.
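Here's a rough sketch of how such a test utility might do the math; the function names and the 6.4-hour test time below are made up for illustration, not taken from any actual benchmark tool:

def watt_hours_per_hour(capacity_wh, start_pct, end_pct, elapsed_hours):
    # Average draw during the test: the slice of rated capacity consumed,
    # divided by how long the test ran
    consumed_wh = capacity_wh * (start_pct - end_pct) / 100.0
    return consumed_wh / elapsed_hours

def projected_runtime(capacity_wh, draw_wh_per_hour):
    # Full-charge runtime for any battery size, given the measured draw
    return capacity_wh / draw_wh_per_hour

# Example: an 80 watt-hour battery drained from 90% to 10% in 6.4 hours
draw = watt_hours_per_hour(80, 90, 10, 6.4)   # 10 watt-hours per hour
print(projected_runtime(80, draw))            # 8 hours on a full charge
print(projected_runtime(60, draw))            # 6 hours with a smaller 60 Wh pack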

Wouldn't that be a very good thing?

What we've been doing

And that's what we've now started doing at RuggedPCReview.com. We use UL Solutions' PCMark 10 Battery test in "Modern Office" mode, which continuously runs typical work activities such as writing, web browsing, and video conferencing. This allows us to report the following information in our reviews and testing results:

“Battery life” -- how many hours the device ran on a full charge in the UL Solutions PCMark 10 Battery test in Modern Office mode. That provides a good idea of how long the machine will last, on the battery that was in the test machine, continuously doing typical tasks. Most likely it will last even longer, because almost no one in the field will use a computer continuously for many hours without it ever idling or going to sleep.

Minutes per watt-hour -- that's the closest in electronics to "miles per gallon" in a vehicle. The gas tank holds so and so many gallons, and the vehicle gets so and so many miles for each gallon. The computer's battery holds so and so many watt-hours of electricity, and the computer runs so and so many minutes for each watt-hour. That's good to know.

Watt-hours per hour -- this is the electricity equivalent of the "liters per 100 kilometers" many European nations use instead of miles per gallon. Only, with electricity it shows how many of the watt-hours in the battery the computer uses up every hour of operation.
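To show how those three figures relate to one another, here's a minimal sketch; the 72 watt-hour capacity and 9-hour result are invented example numbers, not results from our lab:

battery_capacity_wh = 72.0   # rated capacity of the battery in the test unit
test_runtime_hours = 9.0     # hours the unit ran in the battery rundown test

battery_life = test_runtime_hours                                  # "Battery life": 9 hours
minutes_per_wh = test_runtime_hours * 60 / battery_capacity_wh     # 7.5 minutes per watt-hour
wh_per_hour = battery_capacity_wh / test_runtime_hours             # 8 watt-hours per hour

print(battery_life, minutes_per_wh, wh_per_hour)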

Why this should happen

Is such standardized reporting on energy consumption going to happen in the mobile computer industry? It really should in some shape or form. Knowing how efficient your computer is in doing its work should matter. As is, few people know how much running and charging their computers cost in terms of electricity.
But electricity is part of the cost of running computers. It's like paying for gas for cars or paying for utilities at a home or business. An inefficient computer doesn't hurt as much as filling up a gas guzzler at the gas station does. But if you run a company and have hundreds or thousands of computers, it adds up. The savings of using efficient computers could be substantial.

And how efficiently your computer runs matters in other ways as well. Who wants a gas guzzler computer? Who doesn't hate battery "range anxiety" when work needs to be done and battery power quickly drains away? Who is not at least a little interested in knowing how long a computer really runs on that battery, and how much electricity it consumes?

Real life data and results

To see how much power a variety of mobile computers from different manufacturers use, examine the data in the table below. It's from our RuggedPCReview.com testing lab and shows how long the rugged laptops and rugged tablets we tested lasted on a charge of their battery or batteries.
The Hours column shows the precise time each system lasted running the PCMark 10 Modern Office battery draw down test before it was out of charge.

We highlighted the Watt-Hours per Hour and Minutes per Watt-Hour columns in green because they show, in two different ways, the absolute power use of each system. These numbers are each system's actual "gas mileage." The Battery Watt-Hour column shows the storage capacity of the battery or batteries used in the test.


Posted by conradb212 at 4:04 PM

March 15, 2024

No more AdSense

Ads.... I get it. With TV, having to watch commercials was the cost of getting free content. Until streaming came and pretty much replaced cable TV and its double-whammy of commercials AND charging more and more for the service. When Netflix started streaming, you got a lot of content without commercials for a very modest monthly fee. Now, there are dozens of major streaming providers and they all want a steadily increasing monthly fee.

Print news and print magazines are pretty much dead. And they still haven't figured out how to charge for content. Hiding everything behind paywalls ticks people off. And let's not even get into those hideous "subscriptions" for every app and every site. Who thought trying to trick people into subscribing for every little app was a good idea? Who needs hundreds of unmanageable monthly payments that often are almost impossible to cancel? What a nightmare. But you can't expect content to be free, and so we've come to grudgingly accept that nothing is ever truly free.

Banner ads and such initially seemed an acceptable way to go. Unless they take over and become simply too obnoxious. Until there's more click-bait than actual content. And so on. It's a real disaster for all sides.

I thought about this long and hard when we launched RuggedPCReview.com almost two decades ago. Even back then, the handwriting was on the wall. Creating content costs time and money, and there had to be a reasonable, optimal way to make it all work out for all involved.

The solution we came up with back then was a technology sponsorship model: companies that felt our content was worthwhile and needed, and that wanted to be associated and seen with it because it could generate business and impact purchasing decisions, would sponsor our work. Technology sponsors would get banner ads on our front page, and also on the pages that dealt with their products.

However, since it took time to build a sponsorship base that covered our costs, we also set up Google’s AdSense in some strategic locations. Google actually had called me personally to get on our sites. So we designated two small, tightly controlled ad zones in many of our page layouts where Google was allowed to advertise with advertising that we approved.

That initially worked well and for a while contributed a little (but never much) to our bottom line. But as the years went by, Google got greedier and the whole world began packing their websites with Google ads and ads and links and popups and what-all by Google competitors. That soon led to the click-bait deluge that’s been getting worse and worse and worse. And with it, online ads paid less and less and less until it was just a tiny fraction of what it once was, despite way more traffic on our sites.

Then advertisers began posting ads in whatever size they felt like instead of staying within their designated zones, making web pages look ugly, bizarre, and almost unreadable. That, too, got worse and worse and worse, until it was totally unacceptable. Like it was when cable TV shows and movies were more commercials than content. Enough is enough. We simply didn't want to see our carefully crafted pages clobbered and mutilated by out-of-control Google ads.

So we finally pulled the plug on all remaining Google AdSense ads. A sad ending of an approach that began as an effort to advertise responsibly and to the benefit of both sides. As a result, we now have many hundreds of pages with blanks where the (initially well-behaved) Google ads used to be. We'll eventually find a use for that space. Maybe we'll use some to advertise for ourselves, to get new companies to sponsor our work.

As for Google.... please don't be evil; you once said you were not going to be. You initially did such good work with your terrific search engine. It really didn't have to end the way it is now. Too many ads everywhere, search results that are almost all ads and can no longer be trusted. And now Google has ChatGPT to contend with, the AI that responds with just what you ask for. Of course, AI may soon also be turned into just another ad delivery system, an even worse one.

And so the quest for the best way of being compensated for the creation of quality content, one that works for the creators and the advertisers and the consumers, continues.

Posted by conradb212 at 7:33 PM

November 13, 2023

The Thunderbolt disaster

Back in September 2023, Intel very quietly introduced Thunderbolt 5. Intel had shown demos of it as early as late 2022, but it won't be until sometime in 2024 that we'll actually start seeing Thunderbolt 5. And even then, gamers will likely get it first, because speed and bandwidth are what Thunderbolt excels at, and that's what gamers need. We're talking bi-directional transfer speeds of up to 80Gbps, which is VERY fast.

Then there's resolution. It's been nearly a decade since 4k TVs passed old 1080p "full HD" models, and it's only a matter of time until 8k is here, at least with TVs. PC monitors are lagging behind a bit, with just Apple and LG offering 5k screens and monitors, and laptop screens are farther behind yet. But 8k screens will eventually arrive, and they will need a high-speed connection. Thunderbolt 5 will be ready for that. But other than that, what's the case for Thunderbolt 5?

Judging by how Thunderbolt 4 made out, not much. Thunderbolt started as a joint effort between Intel and Apple, combining the PCI Express and DisplayPort signals. Thunderbolt 1 and 2 had special connectors and were primarily used on Apple products. Thunderbolt 3 switched to the reversible USB-C connector, the same as, well, USB uses.

But it wasn't until Thunderbolt 4 appeared in 2020 that the world took notice, sort of. And that was mostly because Intel (stealth-)marketed Thunderbolt 4 as a "one wire solution." The sales pitch was that you could just plug a laptop or tablet into a Thunderbolt 4 dock, and, voila, dual external screens, all the peripherals and also all the charging you needed were right there, including that of the laptop or tablet itself. All of that with just one wire between the laptop and the TB4 dock. Some of our readers were very excited about that.

Except that, as we soon found out, it didn't always work. At RuggedPCReview.com, we tested a good number of high-end new rugged devices that supported Thunderbolt 4 and found that things are far from as simple as they should be.

The problem was that Thunderbolt 4 relies on a good deal of software, configuration and driver footwork. And on just the exact right cables. Far from being a universal plug-and-play standard, the Thunderbolt 4 controllers in computers and docks must first establish "handshakes" between them, figuring out what is and what is not supported, and to what degree. In our testing, almost every machine had one issue or another with different Thunderbolt 4 docks. And since Thunderbolt 4 also enlists the CPU in helping out, we saw overheating in some machines and sometimes substantial drops in performance.

As a result, there was lots of confusion about TB4/USB-C ports, what you could plug in and what they actually delivered. And TB4 was and is super-finicky about cables. And there is now the mounting problem of figuring out which USB port is a Thunderbolt 4 port (and what, exactly, that is and means), and which is just a "regular" USB port and, if the latter, whether it can or cannot provide charging and how much.

Charging, too, was a problem, and often still remains a problem. Laptops, for example, may require so and so many watts to charge, and if a TB4 dock can't deliver that, there will be no charging. Sometimes even if the TB4 dock does meet the wattage requirement. That's because the laptop and the dock may not be able to negotiate a "power contract." And without that, no charging. So you never know if a dock or charger will actually work with TB4. That's very different from USB charging where, for the most part, you can charge anything with any USB charger.

Without that universal charging, what sense does it even make to have TB4 charging? Out there in the field, you want to be sure that your gear can be charged with ANY USB charger.

After lots of testing, we approached the TB4 folks at Intel directly. And got nowhere with our questions. That's not the way it's supposed to be.

It's over two years later now, December 2023. TB4 ports are on most new higher-end Windows laptops and tablets, together with regular USB-C ports. They still have all sorts of icons on them, incomprehensible to most, and sometimes they work and sometimes they don't. That, too, is not how it's supposed to be. And even if everything worked, Thunderbolt docks cost A LOT more than standard USB docks. It all kind of adds up to wondering how anyone ever thought this was a good idea.

Posted by conradb212 at 4:27 PM

April 27, 2023

Should we add pricing and "Best of"?

It's in the nature of tech sites such as RuggedPCReview.com to periodically review the operation and see if it's perhaps time to expand sections, add new things, update or retire this and that, and so on. It's then also time to review the tools and utilities used to manage the site, or perhaps switch to an entirely different system altogether.

As far as the latter goes, every three years or so we've contemplated switching from our "hand-coded" approach that goes back to the very beginning of RuggedPCReview.com almost two decades ago, to a more integrated website management system. That usually means WordPress, which is by far the leader in site creation and management. And each time the answer was a definite "no way." WordPress, of course, is powerful and has massive third-party support. It's a great system for many types of websites, but ours isn't one of them. We'll review that decision again, but I doubt that we'll come to a different conclusion.

But there are other things to consider.

One of them is whether we should add pricing to the comprehensive spec sheet at the end of every one of our product reviews. That sounds like an obvious thing to do, but it really isn't. Pricing is so very relative. There's MSRP, the manufacturer's suggested retail price. There's "street" price. There are quantity discounts. And price always depends on configurations and options. Manufacturers often list a "starting at" price, and we've occasionally done that as well. But with computers, and especially rugged ones with all of their possible deployments and applications, "nicely equipped" can cost twice as much or more than the "starting at" price. Should we simply inquire with the manufacturer and see what they would like for us to list? Perhaps, and we've done that. Or should we just stay with "inquire"?

There's also the fact that, as fairly specialized and usually relatively low-volume products, rugged computing systems are not inexpensive. In an industry that's dealing with customers often contemplating purchasing much lower priced consumer electronics in a protective case, adding to "sticker shock" before prospective customers even consider total cost of ownership of rugged systems is not what we're after.

So much for the price issue. There's something else we've considered off and on: publishing periodic "Best of" listings and awards. Everyone does that these days. Google "Best of xyz" and there's any number of web pages listing whatever they consider best. The fact alone that those lists almost always have Amazon buying buttons next to the products, i.e. the site gets a commission, will make you wonder about the legitimacy of such "Best of" lists.

Even relatively legitimate publications are doing those "Best of" lists and awards, and many of them are, well, questionable at best. We're seeing once legitimate and respected sites and publications now do "Best of" lists and awards where you truly have to wonder how they came to their conclusions. Cheap, basic white-box products that aren't even in the same class beating legitimate, well-established market leaders? Yup, seen it. Merrily mixing products that aren't even competing in the same category? Yup. Watering awards systems down so much that literally everyone gets some award? Yes. Hey, if you're the only semi-enterprise class 11.475-inch tablet available in ocean-blue and with two bumper options, you're the best of that category and deserve an award. Because who does not want to be "award-winning"?

Questions, questions.

Posted by conradb212 at 5:36 PM

October 3, 2022

Was Intel 11th generation "Tiger Lake" the milestone we thought it was?

In the wonderful world of technology, there are few things where progress is as mind-blowingly fast as in electronics. Today you can get an iPhone with 100,000 times the storage capacity of the hard disk in an early IBM PC. And the clockspeed of the CPU in that early PC was a thousand times slower than that in a modern PC. That incredible pace of technological progress has revolutionized the world and our lives, opened up new opportunities and made things possible that weren't even dreamed of just a few decades ago. But that progress also means greatly accelerated obsolescence of the computing products we're using.

That electronic obsolescence wouldn't be so bad if the software on our computers remained more or less static, but that's never the case. Every increase in computing power and storage capacity is quickly soaked up by more complex and more demanding software. Loaded with current software, a state-of-the-art PC of five years ago is slow, and one built ten years ago is barely able to boot.

While the rapid advance is good news for many consumers who, even at a cost, love to have the latest and greatest and don't mind getting new gear every year or two, it's bad news for commercial, industrial and government customers that count on significantly longer life cycles. For them, rapid obsolescence means either extra cost to stay up to date, or falling behind, sometimes hopelessly so.

I wrote about that predicament some time ago in a post called "Intel generations" -- how the rapid succession of Intel Core processor families has made obsolescence an ongoing, costly problem.

And one that isn't going away anytime soon. That's why for manufacturers and consumers alike it's always great when a technology comes along that won't quickly be obsolete or replaced by something different. But that doesn't happen often, and so it's good to at least have milestones -- things that, while not representing the be-all and end-all, at least are here to stay for a while, giving customers a temporary reprieve from rapid obsolescence.

With Intel processors, that's a "generation" that's particularly good, particularly solid, and unlikely to be rendered obsolete at least for a bit longer. Instead of a year, maybe three years or four.

Intel's 6th generation "Skylake" was such an example. Skylake was the last generation of Core processors that still supported earlier versions of Microsoft Windows, and its microarchitecture remained in use until the 11th generation. 8th generation "Coffee Lake" was another such milestone, being the first that brought quad-core processors to mobile computers -- a big step forward.

But it was the "Tiger Lake" 11th generation that (so far) trumped them all with the first new microarchitecture since Skylake, the long awaited switch to 10nm process technology, scalable thermal design power, Intel Iris Xe integrated graphics, Thunderbolt 4 support, and more. What made Tiger Lake so special? Well, for the first time ever, Intel allowed manufacturers to "tune" processors to optimally match their hardware as well as the requirements of their target customers.

Whereas prior to the 11th generation, mobile Core processors were delivered with a set default TDP -- Thermal Design Power -- and could only be tweaked via the Power Plans in the OS, "Tiger Lake" allowed OEMs to create power plans with lower or higher TDP with Intel’s Dynamic Tuning Technology. No longer was the thermal envelope of a processor a fixed given. It was now possible to match the processor's behavior to device design and customers' typical work flows.

Most manufacturers of rugged mobile computers took advantage of that. Here at RuggedPCReview.com we began, for example, seeing a significant difference in device benchmark performance when plugged in (battery life not an issue, emphasis on maximum performance) versus when running on battery (optimized battery life and sustained performance over a wider ambient temperature range). Likewise, device design decisions such as whether to use a fan or rely on less effective passive cooling could now be matched and optimized by taking that into consideration.

As a result, virtually all of the leading providers of rugged laptops and tablet computers switched to Tiger Lake as upgrades or in entirely new designs, leading to unprecedented levels of both performance and economy. A true milestone had been reached indeed.

But not all was well. For whatever reasons, the new technologies baked into Intel's 11th gen Tiger Lake chips seemed to be considered either too classified and proprietary to reveal, or too complicated to be properly implemented. Even the hard-core tech media seemed mostly at a loss. Our extensive benchmarking showed that tweaking and optimizing seemed to have taken place, but not always successfully, and mostly not communicated.

A second technology integrated into Tiger Lake fared even worse. "Thunderbolt" had been a joint effort between Intel and Apple to come up with a faster and more powerful data transfer interface. In time, Thunderbolt evolved to use the popular reversible USB Type-C connector, combining all the goodness of the PCIe and DisplayPort interfaces, and also supporting the super-fast USB 4 with upstream and downstream power delivery capability.

In theory that meant that a true "one wire" solution became possible for those who brought their mobile computer into an office. Tiger Lake machines would just need one USB 4 cable and a Thunderbolt 4 dock to connect to keyboards, mice, external drives, two external 4k displays as well as charging via Thunderbolt 4, eliminating the need for a bulky power brick.

In practice, it sometimes worked and sometimes didn't. Charging, especially, relied on carefully programmed "power contracts" that seemed hard to implement and get to work. Sometimes it worked, more often it didn't.

Then there was the general confusion about which USB port was which and could do what and supported which USB standards. End result: Thunderbolt 4's potential was mostly wasted. We contacted Intel both with 11th gen power mode and Thunderbolt 4 questions, but, after multiple reminders, got nothing more than unhelpful boilerplate responses. And let's not even get into some Iris Xe graphics issues.

So was Intel's "Tiger Lake" 11th generation the milestone it had seemed? For the most part yes, despite the issues I've discussed here. Yes, because most rugged mobile computing manufacturers lined up behind it and introduced excellent new 11th gen-based products that were faster and more economical than ever before.

It's still possible that power modes and Thunderbolt 4 become better understood and better implemented and explained, but time is against it.

That's because rather than building on the inherent goodness of Tiger Lake and optimizing it in the next two or three generations as Intel has done in the past, the future looks different.

Starting with the "Alder Lake" 12th generation of Intel Core processors, there are "P-cores" and "E-cores" -- power and economy cores, just like in the ARM processors that power most of the world's smartphones. How that will change the game we don't know yet. As of this writing (October 2022) there are only very few Alder Lake-based rugged systems and we haven't had one in the lab yet.

But even though "Alder Lake" is barely on the market yet, Intel is already talking about 13th generation "Raptor Lake," available any day now, 14th generation "Meteor Lake" (late 2023), 15th generation Arrow Lake (2024) and 16th generation "Lunar Lake." All these will be hybrid chips that may include a third type of core, and with Lunar Lake Intel is shooting for "performance per watt leadership."

It'll be an interesting ride, folks, but one with very short life cycles.

Posted by conradb212 at 8:28 PM

April 19, 2022

If I were on the board of a rugged device provider...

If I were on the board of a rugged device provider, here's what I would ask them to consider.

Folks, I would say, there are truly and literally billions of smartphones out there. Almost everyone uses them every day, including on the job. Many are inexpensive and easily replaced. Almost all are put in a protective sleeve or case.

That is monumental competition for makers and vendors of dedicated rugged handhelds. To get a piece of the pie, to make the case that a customer should buy rugged devices instead of cheaper consumer devices, you have to outline why they should. Outline, describe, and prove.

You should point out all the advantages of rugged devices. And not just the devices themselves: your expertise, your experience, the extra services you provide, the help you can offer, the connections you have that can help. All of that you must outline and present. The potential payoff is huge. As is, I see a lot of missed opportunity.

So here are a few things I’d like for you to think about:

Rugged-friendly design

As immensely popular as consumer smartphones are, they really are not as user-friendly as they could be. Example: the smartphone industry decided that it’s super-fashionable to have displays that take up the entire surface of the device, and often even wrap around the perimeter. The result is that you can barely touch such devices without triggering unwanted action.

Another example: they make them so sleek and slippery as to virtually guarantee that they slip out of one’s hand. And surfaces are so gleaming and glossy that they are certain to crack or scratch. Rugged handheld manufacturers must stay away from that. Yes, there is great temptation to make rugged handhelds look just as trendy as consumer smartphones, but it should not come at the cost of common sense.

Smartphone makers love flat, flush surfaces without margins around them and without any protective recess at all. And even many rugged device manufacturers make their devices much too slippery. Make them grippy, please. So that they feel secure in one's hand, and so that you can lean them against something without them slipping and falling.

Make sure your customers know just how tough your product is!

I cringe every time I scan the specifications of a rugged device, and there’s just the barest minimum of ruggedness information. Isn’t ruggedness the very reason why customers pay extra for more robust design? Isn’t ruggedness what sets rugged devices apart from consumer smartphones? So why not explain, in detail, what the device is protected against? And how well protected it is? I shouldn’t even have to say that. It is self-evident. Simply adding statements like “MIL-STD compliant” to a spec sheet is wholly insufficient. So here’s what I’d like to see:

It's MIL-STD-810H now!

For many long years, MIL-STD-810G ruled. It was the definitive document that described ruggedness testing procedures. It was far from perfect, because the DOD didn't have rugged mobile device testing in mind when they created the standard. But that's beside the point. The point is that MIL-STD-810G has been replaced with MIL-STD-810H. Yet, years after the new standard was introduced, the majority of ruggedness specs continue to refer to the old standard. Which may make some customers wonder just how serious the testing is. So read the pertinent sections of the new standard, test according to the new standard, and get certified under the new standard's rules.

Add the crush spec

Years ago, when Olympus still made cameras (you can still buy “Olympus” cameras, but they are no longer made by the actual company), the company excelled with their tough and rugged adventure cameras. Olympus “Tough” cameras sported ruggedness properties that met and often exceeded those of rugged handhelds, including handling depths of up to 70 feet even with buttons, ports and a touch screen. Olympus went out of its way to highlight how tough their products were, explained what it all meant, and included pictures and videos.

And they included one spec that I don’t think I’ve ever seen in a rugged handheld, but that was part of almost all Olympus adventure camera specs -- the “crush spec.” How much pressure can a device handle before it gets crushed? Makes perfect sense. On the job it’s quite possible to step on a device. Sit on it, crush it between things. How much can it handle? Consider adding “crush resistance” to the specs.

Always include the tumble spec

The drop spec is good, but after most drops, devices tumble. Which is why a few rugged handheld makers include the tumble spec. It’s different from the static drop test in that it quantifies how many “tumbles” – slipping out of one’s hands while walking with the device – it can handle. That’s a good thing to know. And it should be included in every rugged handheld spec sheet.

Aim for IP68

IP67 is generally considered the gold standard for ingress protection in rugged mobile computers. No dust gets in, and the device can handle full immersion. Within reason, of course, and with IP67 that means no more than three feet of water and no longer than 30 minutes.

Problem is that there are now any number of consumer smartphones that claim IP68 protection. While IP68 isn’t terribly well defined even in the official standard (it essentially says continuous immersion but not how long and how deep), the general assumption of course is that IP68 is better than IP67. And it just doesn’t feel right that a fragile iPhone has an IP68 rating whereas most rugged handhelds max out at IP67. So I’d give that some thought and aim for a good, solid IP68 rating for most rugged handhelds.

Ruggedness information: Set an example

With all those vague, non-specific, all-encompassing "MIL-STD tested!" claims by rugged-wannabes, it is surprising how little specific, solid, detailed ruggedness information is provided by many genuine vendors of true rugged devices. Ruggedness testing is a specific, scientific discipline with easily describable results. The specs of each rugged device should have a comprehensive list of test results, in plain English. And that should be backed up by making detailed test results available to customers. A brief "Tested to MIL-STD requirements" simply is not enough.

Better cameras!

Almost every smartphone has at least two cameras built in these days – a front-facing one for video calls and a rear-facing one to take pictures – and many have three or four or even more. And that’s not even counting IR cameras, LiDAR and others. The cameras in almost all smartphones are very good and a good number are excellent.

Sadly, the majority of cameras in rugged handhelds aren’t very good, ranging from embarrassingly bad to sort of okay, with just a very few laudable exceptions. That is not acceptable. Cameras in rugged tools for the job should be just as good or better than what comes in a consumer phone. Users of rugged handhelds should be able to fully count on the cameras in their devices to get the job done, and done well. That is not the case now, and that must change. Shouldn’t professional users get professional gear, the best?

Keep Google contained

Sigh. Google owns Android, and Android owns the handheld and smartphone market for pretty much every device that’s not an iPhone. We’re talking monopoly here, and Google is taking advantage of that with an ever more heavy-handed presence in every Android device. During setup of an Android phone or computer users are practically forced to accept Google services, and Google products and services are pushed relentlessly. Compared to early Android devices, the latest Android hardware feels a bit like delivery vehicles for Google advertising and solicitation. That’s an unfortunate development, and providers of rugged handhelds should do whatever they can to minimize Google’s intrusions and activities on their devices.

But, you might say, isn't there Android AOSP (the "Android Open Source Project") that is free of Google's choking presence? Yes, there is AOSP, or I should say there was. While AOSP still exists, Google has gone out of its way to make it so barren and unattractive that it feels like a penalty box for those who refuse to give Google free rein over their devices. None of the popular Google apps are available on AOSP, and there is no access to the unfortunately named Google Play Store. AOSP users who want to download apps must rely on often shady third-party Android app stores. And backup is disabled on AOSP. Yes, no backup. Some of this has changed as of late, but AOSP remains a sad place. Your customers deserve better. Find a solution!

Android updates…

Android’s rapid-fire version update policy has long been a source of frustration. That’s because, unlike Apple or Microsoft OS software, Android OS updates may or may not be available for any given Android device. Have you ever wondered why so many rugged Android devices seem to run on old versions of Android? That’s because they can’t be upgraded. Customers often need to wait for a tech update of a device that comes with a newer version. The situation is so bad that a vendor guarantee that a device will support the next two or three versions is considered an extra. It shouldn’t be that way. Yes, each new rev of an operating system is usually bigger and bulkier than the prior one, and thus makes hardware obsolete after a while. But not being able to upgrade at all? Unacceptable.

What IS that emphasis on enterprise?

Google likes to talk about Android for the enterprise, an effort to make Android devices more acceptable for use in enterprises. That means extra security and conforming to standard workplace practices. The problem is that it’s not terribly clear what exactly that means. Google’s Android Enterprise page says “The program offers APIs and other tools for developers to integrate support for Android into their enterprise mobility management (EMM) solutions.” All good buzz words, but what do they actually mean? I think rugged handheld providers should make every effort to spell out exactly for their customers what it means.

Add dedicated/demo apps

Among Google’s many bad habits with Android is the constant renaming and reshuffling of the user interface. From version to version everything is different; features are moved around, grouped differently, and so on. Sometimes just locating a necessary setting takes way too much time. Rugged handheld vendors should help their customers as much as they can by offering or creating demo apps, grouping important features, turning off Google’s often intrusive and self-serving defaults, and also creating or packaging apps that truly add value for customers.

Custom cases or sleeves

The case of the case is a weird one. Consumer smartphones are, for truly no good reason, as slender, fragile and glitzy as possible, so much so that almost all users get a protective case that guards against scratching and breaking. The smartphone industry has delegated ruggedness to third party case vendors. A very weird situation.

That said, decent cases do protect, and sometimes amazingly well. So much so that there’s any number of YouTube videos of iPhones in cases shown to survive massive drops, again and again. Rugged handhelds have ruggedness built in. They don’t need a case. But not many claim a drop spec higher than four feet. Which is most peculiar, because when you use such a device as a phone and it drops while you’re making a call, it’ll fall from more than four feet. Five or six feet should definitely be standard in a rugged handheld. But that may require more protection, and that means a bigger, bulkier case.

So why not take a cue from those few very smart suppliers that offer custom protective sleeves that add extra protection for just a few dollars? Just in case a customer needs the extra protection.

Dare to be different

On my desk I have six handhelds. Apple and Android. Premium and economy priced. Rugged and non-rugged. They all look the same. Glossy black rectangles with rounded corners. There’s nothing inherently wrong with that, but why not dare to be different? Dare to feature functionality rather than make it blend in. Dare to have a brand identity. Sure, if a billion and a half phones are sold each year that all look like glossy black rectangles, it takes guts to be different, to not also make a glossy black rectangle. Maybe deviate just a little? Make some baby steps?

And that’s that. A few things to consider if you design, make, or distribute rugged handhelds. The complete global acceptance of handheld computers for virtually everything has opened vast new markets for rugged handhelds. A share of that can be yours, a potentially much larger one than ever before. But you have to differentiate yourself and emphasize your strengths and the compelling advantages of your products.

Make it so.


Posted by conradb212 at 8:57 PM