New “Mailsploit” Exploit Allows Email Spoofing

Phishing attacks just got a whole lot easier.

A German security researcher named Sabri Haddouche has recently discovered a set of email vulnerabilities that have been collectively dubbed “Mailsploit.”  At the root, these vulnerabilities stem from the way most email clients decode sender addresses encoded with a 1992 standard called RFC 1342.

Email standards require that everything in a message header be made up of ASCII characters, so RFC 1342 defines a way to convert non-ASCII text into ASCII “encoded words” that the receiving client decodes before displaying. Unfortunately, a shockingly large number of email clients (33 and counting) make no effort to check the decoded header for malicious content.

Worse, if the decoded header contains a null byte, or two or more email addresses, many clients display only the address that precedes the null byte, or the first valid email address they encounter. That quirk is what lets an attacker control which sender the victim sees.
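To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of how a crafted From header can decode into a spoofed sender on a client that naively truncates at the null byte. The addresses and domain are invented for illustration, and this shows the general shape of the trick rather than Haddouche's exact payloads:

    import base64
    from email.header import decode_header

    # Hypothetical illustration only: hide the address the victim should see
    # inside an RFC 1342 "encoded word", terminated by a null byte, and append
    # the real sending domain afterward.
    spoofed = "ceo@your-company.example\0"            # address the victim will see
    encoded_word = "=?utf-8?b?" + base64.b64encode(spoofed.encode()).decode() + "?="
    from_header = encoded_word + "@attacker.example"  # actual sending domain

    # What a vulnerable client effectively does: decode the encoded word, then
    # stop reading at the first null byte, losing the attacker's real domain.
    decoded, charset = decode_header(from_header)[0]
    displayed = decoded.decode(charset or "ascii").split("\0")[0]
    print(displayed)   # -> ceo@your-company.example

A client that validated the decoded header, or refused to render control characters, would instead show the full, obviously suspicious string.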

The email clients vulnerable to this type of attack include:

  • Apple Mail
  • Mail for Windows 10
  • Microsoft Outlook 2016
  • Mozilla Thunderbird
  • Yahoo! Mail
  • AOL Mail

The list goes on, but Haddouche notes that Gmail is unaffected by the exploit.

There are two ways a hacker can use Mailsploit. First, and most obvious, it can be used to spoof the sender’s address, making a message appear to come from someone you know, which of course makes it much more likely that you’ll click on any links embedded in the body of the message.

Second, and potentially even more troubling, the exploit can be used to inject malicious code into the recipient’s email client, which can give the hacker who sent the message a path to taking control of the target’s system.

Worst of all, though, is the fact that while Haddouche contacted all of the companies found to offer vulnerable email clients, only eight of them have released a patch to correct the issue. Twelve vendors opted to triage the bug but gave no information on whether or when the issue might be patched, and twelve others made no reply at all.

Mozilla and Opera (both vulnerable) flatly refused to address the problem, which they see as a server-side issue.

Your IT staff’s job just got a whole lot harder.

Hard Drives May Double In Speed With New Technology

What’s an HDD manufacturer to do when faced with competition by faster, more efficient SSD drives?

Go big, and go faster. At least that’s the strategy that both Seagate and Western Digital are adopting.

SSDs tend to get prohibitively expensive as their size crosses the 1TB threshold, which creates an opportunity for HDD manufacturers. Seagate is currently selling drives with an impressive 14TB of capacity and plans to introduce a 40TB drive by 2023, with Western Digital not far behind, aiming for a 40TB drive by 2025.

That’s impressive, but as Seagate mentioned in a recent blog post:

“Capacity is only half of the solution. If the ability to rapidly access data doesn’t keep pace with all that capacity, the value potential of data is inhibited. Therefore, the advancement of digital storage requires both elements: increased capacity and increased performance.”

In order to address the performance side of the equation, Seagate is experimenting with a new approach called “multi-actuator technology.”

HDDs store data on spinning platters, with read/write heads mounted on actuator arms above and below each platter.

In a conventional drive, all of the arms hang off a single actuator and move in tandem, so at any given moment only one head is reading from or writing to the disk.

Seagate’s new solution uses two sets of actuator arms, each controlled independently of the other. With two sets of heads able to read and write simultaneously, HDD throughput can effectively be doubled.
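As a rough back-of-the-envelope illustration (a toy model, not Seagate's actual firmware, and the 8 ms per-request cost is an assumed number), splitting a queue of I/O requests across two independent actuators cuts total service time roughly in half:

    SEEK_AND_TRANSFER_MS = 8  # assumed average cost per request (illustrative)

    def single_actuator(requests: int) -> int:
        # One actuator: requests are serviced one after another.
        return requests * SEEK_AND_TRANSFER_MS

    def dual_actuator(requests: int) -> int:
        # Two independent actuators: each services half the queue in parallel,
        # so total time is set by the busier of the two arms.
        per_arm = (requests + 1) // 2
        return per_arm * SEEK_AND_TRANSFER_MS

    print(single_actuator(1000))  # 8000 ms
    print(dual_actuator(1000))    # 4000 ms, roughly double the throughput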

It’s an idea that has been around for a while, but until recently the prohibitive cost of the components made it impractical. With component prices falling, it’s suddenly viable, and the combination of massive capacities and the new technology is making people take a second look at HDDs.

This is a great advance, and one that breathes new life into HDDs.

Performance Issues Plague PCs Updated With Spectre Patch

Recently, critical flaws were found inside every Intel chip made during the last decade.  The flaws make two different exploits possible, which have been dubbed “Meltdown” and “Spectre.”

The flaws are incredibly severe, making it possible for a hacker to read sensitive data, including passwords and encryption keys, straight out of the memory of the targeted PC or laptop.  Although no attacks exploiting them have yet been found in the wild, now that both are publicly known, it’s only a matter of time before that happens.

Given the severity of the flaws, Intel scrambled to release an update, but here’s the catch:  The fix was expected to hurt system performance, lowering it by as much as 23%.

In the end, it didn’t matter.  Ignoring the problem was simply not an option, so the company pushed the update out anyway.  Unfortunately, the fix has proved to be even more problematic than originally estimated.  In addition to degrading machine performance, it also interferes with a variety of maintenance activities and leads to an inordinate number of system reboots.

Initially, Intel advised its customers to proceed with the download in order to protect their systems, even in light of the performance degradation.  However, as the number of complaints has grown, the company has reversed course and is now advising against downloading the latest update, asking users to wait for a revision to be published.

At this point, the company has not given an ETA on when the revised firmware update will be ready, but until it is, you’re placed in an awkward position.  Waiting for the update means exposing your company to risk, should a hacker target one of the machines on your network with the exploit.  Proceeding with the current firmware update means you’ll suffer performance issues, leaving you stuck between a rock and a hard place, at least for the short term.

Microsoft Office Update Available Only To Windows 10 Users

There are big changes coming to MS Office that you need to be aware of, given how widely used Office is in most companies.

First, the headline change:  When MS Office 2019 is released, it will only run on Windows 10.  If you’ve still got machines on older operating systems, and you want to keep your productivity suite up to date, then you’ll need to upgrade those older systems.

Also, be aware that when Office 2019 ships, it will be deployed only via “Click-to-Run” technology.  There will be no MSI installers for the client apps, although the Office Server products will still have an MSI deployment option.

In terms of software support, the company had this to say:

“Office 2019 will provide five years of mainstream support and approximately two years of extended support.  This is an exception to our ‘Fixed Lifecycle Policy’ to align with the support period for Office 2016.  Extended support will end 10/14/2025.”

The Office 2019 bundle will include the following apps:

  • Word
  • Excel
  • PowerPoint
  • Outlook
  • Skype for Business

Additionally, server versions of SharePoint and Exchange will be available.

In conjunction with the announcement above, the company also announced servicing extensions for Windows 10, and changes to the system requirements for people who use Office 365 ProPlus, the subscription-based, locally installed version of the Office suite.

Beginning on January 14, 2020, Office 365 ProPlus will no longer be supported on Windows 7, Windows 8.1, Windows Server 2016, or any Windows 10 LTSC (Long Term Servicing Channel) release.  Windows 10 versions 1511, 1607, 1703, and 1709, however, will each get an additional six months of servicing for enterprise and education customers.

Although these changes will no doubt inconvenience some users, overall, they have to be judged as a positive.  Microsoft has been taking a number of meaningful steps in recent years to streamline and simplify their product support, and these latest changes are very much in keeping with that.

Traditional Hard Drive Technology Is Evolving

Rumors of the death of HDD technology have been greatly exaggerated.  The advancement of solid state technology and its increasing rate of adoption are largely responsible for those rumors, but don’t count old school HDDs out just yet.  They still have several important advantages, and recent breakthroughs should further extend the longevity of the tech.

Right now, the biggest advantage that HDDs have over their solid-state counterparts is sheer capacity for the price.  While it would be prohibitively expensive to purchase 20+ terabytes of solid-state storage, getting that amount (or more) of HDD storage is a trivial undertaking, a fact that’s impossible to discount.

Even more exciting, though, is the recent breakthrough in 3D nano-magnets developed at the University of Cambridge, which stands to completely change the game.  These structures allow data to be stored and processed in three-dimensional space, which could not only increase HDD storage density exponentially, but should also bring similar gains in access speed.

Another exciting recent breakthrough is a new magnetic system that turns heat into motion, which could be used to power miniaturized IoT sensors and actuators.  Applied to HDD technology, the same approach could use waste heat from the drive itself to power the lasers that write data, leading to a significant boost in operating efficiency.

Finally, consider the invention coming out of Imperial College London.  Researchers there have figured out a way to write magnetic patterns onto nano-wires, which the research team claims could mimic the function of the human brain.  While this technology is still in its infancy, imagine the dazzling possibilities of a computer, or even part of a computer (like your HDD), powered by something that mimics the way the human brain works.

All that to say, while HDD tech might be a little long in the tooth, it’s not dead yet.  Not by a longshot.

SSD Drive Makers Adding Features To Reduce Duplicate Data

Big changes are in the works in the SSD-based storage ecosystem, with three different vendors making similar announcements about features designed to help companies that rely on SSD-based storage reduce duplication and control data creep.

It’s not hard to see why they’re scrambling.  Although the price of SSD-based storage is coming down, it’s a slow process.  Currently, a gigabyte’s worth of SSD storage costs about forty cents, versus about five cents per gigabyte of HDD storage.  Less data duplication means less data to store, making the SSD systems utilizing the new technology more cost-effective.
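A quick back-of-the-envelope calculation shows why deduplication matters so much here, using the per-gigabyte prices above and, hypothetically, the 5:1 reduction ratio IBM claims below:

    ssd_cost_per_gb = 0.40   # dollars, as quoted above
    hdd_cost_per_gb = 0.05   # dollars, as quoted above
    dedup_ratio = 5          # hypothetical 5:1 reduction (IBM's claim below)

    effective_ssd_cost = ssd_cost_per_gb / dedup_ratio
    print(f"${effective_ssd_cost:.2f}/GB effective for SSD vs ${hdd_cost_per_gb:.2f}/GB for HDD")
    # -> $0.08/GB effective for SSD vs $0.05/GB for HDD

Deduplication alone doesn't close the gap entirely, but it narrows it considerably.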

Here’s a quick overview of the solutions offered by the three major vendors in this space:

  • Hitachi – Hitachi is upgrading its all-flash F-Series and hybrid flash/hard disk G-Series arrays, as well as its SVOS operating system. The operating system improvements include new AI-based container and operations support, along with a new feature called the “Hitachi Infrastructure Analytics Advisor,” which provides real-time analysis of data center optimization across all storage devices, networks, servers and virtual machines in a bid to more efficiently predict data center needs and optimize and troubleshoot data storage.
  • HPE – The company has upgraded its “Nimble” storage line, which includes an array of all-flash products, a hybrid disk-flash product line and a secondary flash line. The big change here is that the company’s products now support inline, variable block size deduplication (a minimal sketch of the general idea appears after this list).  The company claims this change makes its products “the most efficient hybrid arrays in the industry by a wide margin.”
  • IBM – IBM has issued an upgrade to its Storwize arrays, the first in more than two years. The update improves cloud integration and overall disk performance, and adds an array of enhanced deduplication tools, with IBM claiming as much as a 5:1 data reduction while retaining 100 percent data availability (provided you’re using IBM HyperSwap).
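For readers unfamiliar with the technique, here is a minimal sketch of the general idea behind the variable-block deduplication mentioned above: split incoming data into chunks at content-defined boundaries, hash each chunk, and store each unique chunk only once. This illustrates the concept, not any vendor's actual implementation, and the chunking parameters are arbitrary:

    import hashlib

    def chunks(data: bytes, mask: int = 0x3F, min_size: int = 16, max_size: int = 256):
        # Tiny content-defined chunker: cut whenever a rolling hash of the bytes
        # hits a boundary pattern, so identical content yields identical chunks
        # even if surrounding data shifts (which is what makes block sizes variable).
        start = 0
        rolling = 0
        for i, byte in enumerate(data):
            rolling = (rolling * 31 + byte) & 0xFFFFFFFF
            size = i - start + 1
            if (size >= min_size and (rolling & mask) == 0) or size >= max_size:
                yield data[start:i + 1]
                start = i + 1
                rolling = 0
        if start < len(data):
            yield data[start:]

    def dedup_ratio(data: bytes) -> float:
        # Store each unique chunk once, keyed by its SHA-256, and report raw/stored.
        store = {hashlib.sha256(c).hexdigest(): c for c in chunks(data)}
        stored = sum(len(c) for c in store.values())
        return len(data) / stored

    # A workload full of repeated content dedupes well:
    print(dedup_ratio(b"All work and no play makes Jack a dull boy. " * 200))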

How big an impact these changes will have remains to be seen, but kudos to all three companies for taking decisive steps to bolster the performance of their storage devices.

New Chips Support Increased Network Speeds To 400Gbps

Marvell Semiconductor has a new product out, and it’s a game changer.  Their new “Alaska” chip (the Alaska C 88X7120) is the first on the market to support the new 802.3cd standard, which is on tap to eventually replace current Ethernet ports running at 25Gbps to 100Gbps with ports running at 50Gbps, 200Gbps, and 400Gbps.

The future is now.

Granted, the Alaska chips aren’t for sale just yet, but they are sampling to customers (“Sampling” in the chip world is akin to beta testing in software).  The chip supports sixteen 50 Gbps ports, four 200 Gbps ports, and two 400 Gbps ports, which will quadruple network output.  Even better, the new chips support both copper and fiber-optic wiring, as well as SerDes (long-reach serialization/deserialization) on system and line side interfaces, allowing OEMs to use the chips for wide-area interfaces.
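Taking the port counts above at face value, the arithmetic works out the same way for each configuration:

    # Aggregate bandwidth per configuration, using the port counts quoted above.
    configurations = {"16 x 50 Gbps": 16 * 50, "4 x 200 Gbps": 4 * 200, "2 x 400 Gbps": 2 * 400}
    for name, total in configurations.items():
        print(f"{name}: {total} Gbps aggregate")
    # Each option adds up to 800 Gbps of aggregate port bandwidth on a single chip.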

Also of interest, the new chips use PAM4 (pulse-amplitude modulation), which is a four-level signaling scheme that’s designed to replace NRZ (non-return to zero) binary modulation, and even better, the new PAM4 protocol will be backwards compatible with NRZ hardware.
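To see why PAM4 matters, here is a minimal sketch of the difference: NRZ carries one bit per symbol (two levels), while PAM4 carries two bits per symbol (four amplitude levels), so the same symbol rate moves twice the data. The Gray-coded mapping below is a common convention, used here purely for illustration:

    PAM4_LEVELS = {"00": 0, "01": 1, "11": 2, "10": 3}   # Gray-coded, illustrative

    def nrz_symbols(bits: str) -> list:
        return [int(b) for b in bits]                     # one symbol per bit

    def pam4_symbols(bits: str) -> list:
        pairs = [bits[i:i + 2] for i in range(0, len(bits), 2)]
        return [PAM4_LEVELS[p] for p in pairs]            # one symbol per two bits

    bits = "1101001011110001"
    print(len(nrz_symbols(bits)))    # 16 symbols to send 16 bits
    print(len(pam4_symbols(bits)))   # 8 symbols to send the same 16 bits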

The port density on the new chip has been optimized to enable both Quad Small Form Factor Pluggable Double Density (QSFP-DD) and Octal Small Form Factor Pluggable (OSFP) port types for 50 GbE, 200 GbE, and 400 GbE deployments.

If all of those technical details make your head spin, not to worry.  The short of it is that once these chips go mainstream, network throughput is going to increase dramatically, which means data center networks are about to get even faster.

Unless you run or manage a huge data center, you’ll probably never have direct contact with these chips. However, as big data centers begin deploying them, you’ll absolutely see the benefits.

Study Shows Employee Satisfaction Is Higher With Technology Improvements

A new study recently published by HPE Aruba, called “The Right Technologies Unlock The Potential Of The Digital Workplace,” reveals some interesting details about technology in the workplace that are worth paying attention to.

The study was conducted by collecting feedback from more than seven thousand employees at companies of various sizes around the globe.  Respondents were broken broadly into two groups: “Digital Revolutionaries,” who made more and better use of cutting-edge technology, and “Digital Laggards,” who were slower to adopt the latest and greatest technologies.

The headline statistic is that 51 percent of employees working in companies employing more technology reported greater job satisfaction, and an impressive 72 percent of employees in these companies reported a greater ability to adopt new work-related skills.

Other intriguing statistics include:

  • 31 percent of respondents in the “Digital Laggard” category indicated that tech aided their professional development, compared with 65 percent in the “Digital Revolutionary” category
  • 92 percent of respondents said that more technology would improve the workplace overall
  • 69 percent of respondents indicated a desire to see fully automated equipment in more widespread use in the workplace

Joseph White, the Director of Workplace Strategy, Design and Management at Herman Miller, said in a press release:

“No matter the industry, we’re seeing a move toward human-centric places as enterprises strive to meet rapidly changing expectations of how people want to work.  This depends upon combining advances in technology -which includes furnishings- with the cognitive sciences to help people engage with work in new ways.  This will not only mean singular, premium experiences for individuals, but also the opportunity for organizations to attract and retain the best talent.”

The study notes, however, that cyber security issues remain as challenging as ever.  Survey respondents reported lower than average cyber security awareness, which could lead to greater risks and exposure as workplaces become increasingly digitized.

While a small majority (52 percent) of respondents reported thinking about cybersecurity often (daily), fully a quarter have connected to unsecured WiFi, and one in five reported using the same password across multiple web properties, two of the most dangerous cybersecurity-related behaviors.

Clearly, increased technology has its risks.