Monday, February 08, 2016

CEVA targets IoT systems with comms reference design

By Nick Flaherty

DSP core designer CEVA has developed a reference platform to accelerate the design of low-data-rate machine-to-machine (M2M) and IoT communication applications, including smart grid, surveillance systems, asset tracking, remote monitoring systems, connected cars and smart utilities.

The Dragonfly multifunction platform is enabled by the recently announced CEVA-XC5 and CEVA-XC8 DSP cores and accompanied by the hardware and software components required to rapidly design machine-type communications (MTC) systems. The platform supports existing and emerging LTE MTC releases, LPWAN standards (such as LoRa, Sigfox and Ingenu), as well as Wi-Fi, GPS or any other IoT-related communications standard set to be deployed for M2M communication.

Dragonfly offers system developers a flexible platform that allows for optimal hardware/software system partitioning, combining a low power vector communication DSP with a range of hardware co-processors. Such partitioning enables the software flexibility essential for upgradability and long service life of typical M2M devices, while delivering the power efficiency required to support extended battery life of up to ten years. 

For example, CEVA licensees developing M2M systems incorporating LTE Cat-1 or Cat-0 today can easily upgrade those systems to support LTE Cat-M or other future standards when available. The DSP can also be used to implement proprietary features for specific device use cases, such as seamless indoor and outdoor positioning concurrently with Wi-Fi 802.11n or LTE Cat-0, in a highly efficient manner.
CEVA's Dragonfly reference design combines the XC5 or XC8 DSP cores with the RTOS and communications software needed for IoT designs

“Our Dragonfly reference platform brings together all of the essential hardware, software and system integration components required by customers developing low-power machine-type communication solutions, in a highly cost and power efficient manner,” said Michael Boukaya, vice president and general manager, Wireless Business Unit at CEVA. “We have leveraged our deep expertise in low-power baseband processing and complemented it with a range of software offerings to deliver a platform that is highly customizable and flexible for developing a broad range of IoT and M2M products, quickly and efficiently.”

The Dragonfly reference platform includes the vector communications DSP and all the required co-processors and interfaces, together with software application layers and libraries, RTOS and drivers for MTC systems design. These hardware and software components are available for LTE MTC, Wi-Fi and GNSS standards. Also included is a 500MHz silicon-based development system that includes all of these components together with RF frontends and a host interface. 

“Low-data rate LTE is a key building block of machine-type communications and together with CEVA, we are simplifying the process of integrating LTE connectivity into IoT devices,” said Denis Bidinost, Chief Executive Officer of NextG-Com. “Our ALPSLite protocol stack is the industry’s first 3GPP Cat-0 stack specifically designed for IoT applications and our optimized implementation for the Dragonfly platform leads the industry in terms of power efficiency and reliability.”

“Accurate positioning, both indoor and outdoor, will be a fundamental component of many M2M applications and our CellNav™ technology delivers this accuracy utilizing the existing LTE network infrastructure,” said Rabih Chrabieh, CEO of Nestwave. “Using the CEVA Dragonfly platform, customers can integrate CellNav into their MTC product designs, enabling reliable location tracking in devices that can last years in the field on a single battery.”

“The CEVA Dragonfly reference platform delivers exceptional performance for implementing our software-based GNSS receivers in devices within a stringent power budget,” said Eli Ariel, CEO at Galileo Satellite Navigation. “Our Software Receiver solution perfectly complements CEVA’s software-based approach to design flexibility and long-service-life MTC systems design, allowing customers to carry out performance improvements and add new features in the field, including upgrading to future satellite systems.”

Thursday, February 04, 2016

Cisco to buy Jasper Technologies for IoT in $1.4bn deal

By Nick Flaherty

Deal Brings Together Connectivity, Security, Automation and Real-Time IoT Analytics

Consolidation in the embedded industry continues as IoT pioneer Jasper Technologies is to be bought by networking giant Cisco Systems for $1.4bn.

Santa Clara-based Jasper developed a cloud-based IoT service platform to help enterprises launch, manage and monetize IoT services on a global scale, and it has become a leading IoT service platform provider with over 3500 enterprise customers and 27 service providers around the world.

The technology allows companies to connect any device – from cars to jet engines to implanted pacemakers – over the cellular networks of the top global service providers, and then manage connectivity of IoT services through Jasper’s Software as a Service (SaaS) platform.

IoT brings with it many complexities, such as connecting and securing millions of devices and collecting and analyzing data. Jasper simplifies these challenges and helps customers accelerate the shift to IoT by automating the management of IoT services across connected devices.

The proposed acquisition will allow Cisco to offer a complete IoT solution that is interoperable across devices and works with IoT service providers, application developers and an ecosystem of partners. Cisco will continue to build upon the Jasper IoT service platform and add new IoT services such as enterprise Wi-Fi, security for connected devices, and advanced analytics to better manage device usage.
“I am excited about the opportunity for Cisco and Jasper to accelerate how customers recognize the value of the Internet of Things,” said Chuck Robbins, Cisco's Chief Executive Officer. “Together, we can enable service providers, enterprises and the broader ecosystem to connect, automate, manage, and analyze billions of connected things, across any network, creating new revenue streams and opportunities.”

“IoT has become a business imperative across the globe. Enterprises in every industry need integrated solutions that give them complete visibility and control over their connected services, while also being simple to implement, manage and scale,” said Jahangir Mohammed, Jasper Chief Executive Officer. “By coming together, Jasper and Cisco will help mobile operators and enterprises accelerate their IoT success.”   

Mohammed will run the new IoT Software Business Unit under Rowan Trollope, Cisco senior vice president and general manager of the IoT and Collaboration Technology Group. The acquisition is expected to close in the third quarter of fiscal year 2016.

MIT and TI develop hack-proof ferroelectric RFID chips

By Nick Flaherty

Researchers at the Massachusetts Institute of Technology and Texas Instruments have developed a new type of radio frequency identification (RFID) chip that they believe is virtually impossible to hack.
This is a key embedded technology to prevent the spoofing of an Internet of Things (IoT) network node or unauthorised access to contactless credit cards.
TI has built several prototypes of the new chip using its ferroelectric process technology, which provides both non-volatile storage and an on-chip energy reserve. This is to thwart side-channel attacks that analyze patterns of memory access or fluctuations in power usage when a device is performing a cryptographic operation, in order to extract its cryptographic key.
Interestingly, the MIT research work was also funded by the Japanese automotive company Denso, which raises the question of how it plans to use hack-proof FRAM devices in car systems.
"The idea in a side-channel attack is that a given execution of the cryptographic algorithm only leaks a slight amount of information," said Chiraag Juvekar, a graduate student in electrical engineering at MIT and part of the research team. "So you need to execute the cryptographic algorithm with the same secret many, many times to get enough leakage to extract a complete secret."
One way to thwart side-channel attacks is to regularly change secret keys. In that case, the RFID chip would run a random-number generator that would spit out a new secret key after each transaction. A central server would run the same generator, and every time an RFID scanner queried the tag, it would relay the results to the server, to see if the current key was valid.
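The rotating-key scheme described here can be sketched as a simple hash-chain ratchet: after every transaction, both the tag and the server hash the current secret forward. The function and variable names below are purely illustrative, not taken from the MIT/TI design:

```python
import hashlib
import hmac

def next_key(key: bytes) -> bytes:
    """Derive the next secret by hashing the current one forward."""
    return hashlib.sha256(key).digest()

def tag_respond(key: bytes, challenge: bytes) -> tuple[bytes, bytes]:
    """Tag answers the reader's challenge, then immediately rotates its key."""
    response = hmac.new(key, challenge, hashlib.sha256).digest()
    return response, next_key(key)

def server_verify(key: bytes, challenge: bytes, response: bytes) -> tuple[bool, bytes]:
    """Server runs the same generator and checks the relayed response."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    ok = hmac.compare_digest(expected, response)
    return ok, next_key(key) if ok else key

# One transaction: both sides end up holding the same fresh secret.
key = b"initial-shared-secret"
challenge = b"reader-nonce-001"
resp, tag_key = tag_respond(key, challenge)
ok, server_key = server_verify(key, challenge, resp)
print(ok, tag_key == server_key)  # True True
```

Because each transaction consumes the old key, an attacker never sees the same secret used twice - which is exactly the property the power-glitch attack below tries to defeat by freezing the key.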
Such a system would still, however, be vulnerable to a "power glitch" attack, in which the RFID chip's power would be repeatedly cut right before it changed its secret key. An attacker could then run the same side-channel attack thousands of times, with the same key. Power-glitch attacks have been used to circumvent limits on the number of incorrect password entries in password-protected devices, but RFID tags are particularly vulnerable to them, since they're charged by tag readers and have no onboard power supplies.
Two design innovations allow the MIT researchers' chip to thwart power-glitch attacks: One is an on-chip power supply whose connection to the chip circuitry would be virtually impossible to cut, and the other is a set of "nonvolatile" memory cells that can store whatever data the chip is working on when it begins to lose power.
For both of these issues, the researchers used ferroelectric crystals, which TI is highly experienced in making with its FRAM-based microcontrollers for non-volatile storage. A ferroelectric crystal can also be thought of as a capacitor, an electrical component that separates charges and is characterized by the voltage between its negative and positive poles. Texas Instruments' manufacturing process can produce ferroelectric cells with either of two voltages: 1.5 volts or 3.3 volts.
The researchers' new chip uses a bank of 3.3-volt capacitors as an on-chip energy source. But it also features 571 1.5-volt cells that are discretely integrated into the chip's circuitry. When the chip's power source -- the external scanner -- is removed, the chip taps the 3.3-volt capacitors and completes as many operations as it can, then stores the data it's working on in the 1.5-volt cells.
When power returns, before doing anything else the chip recharges the 3.3-volt capacitors, so that if it's interrupted again, it will have enough power to store data. Then it resumes its previous computation. If that computation was an update of the secret key, it will complete the update before responding to a query from the scanner. Power-glitch attacks won't work.
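A toy model of this checkpoint-and-resume behaviour makes the idea concrete; the step counts and data structure here are illustrative assumptions, not the actual chip logic:

```python
class GlitchResistantTag:
    """Toy model of the checkpoint-and-resume scheme described above."""
    TOTAL_STEPS = 5  # illustrative: work needed to finish one key update

    def __init__(self):
        # Models the 1.5 V ferroelectric cells: contents survive power loss.
        self.nv = {"key_version": 0, "step": 0}

    def power_cycle(self, energy_steps):
        """One power-on: the 3.3 V reservoir funds `energy_steps` of work."""
        for _ in range(energy_steps):
            self.nv["step"] += 1              # checkpoint after every step
            if self.nv["step"] == self.TOTAL_STEPS:
                self.nv["key_version"] += 1   # key update completes
                self.nv["step"] = 0
                break

tag = GlitchResistantTag()
for _ in range(3):                # an attacker cuts power every two steps
    tag.power_cycle(2)
print(tag.nv["key_version"])      # 1 - the update completes despite the glitches
```

Because progress is checkpointed to non-volatile cells at every step, cutting power merely pauses the key update rather than resetting it, so the attacker can never replay the same key indefinitely.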
Because the chip has to charge capacitors and complete computations every time it powers on, it's somewhat slower than conventional RFID chips. But in tests, the researchers found that they could get readouts from their chips at a rate of 30 per second, which should be more than fast enough for most RFID applications.

Wednesday, February 03, 2016

Zeetta Networks to commercialise its IoT operating system

By Nick Flaherty

Zeetta Networks, which focuses on the design, development and marketing of open networking solutions, has received funding of £1.25 million to commercialise the University of Bristol’s software-defined networking technology to smart enterprises and Internet of Things (IoT).
The company is a spin-out from the University’s High Performance Networks group, a team internationally renowned for its expertise in software-defined networking and network virtualization.
Zeetta breaks vendor lock-in using a unique open networking platform based on industry-standard hardware and powerful orchestration software - named NetOS - which manages, automates and monitors the whole network while significantly reducing its costs. This offers “USB-like”, plug-n-play management of different types of connected network devices and enables the construction of virtual “network slices”, for example separate logically-isolated sub-networks for the deployment of business-to-business or business-to-consumer services, such as Ultra HD wireless video distribution, city-wide Wi-Fi, IoT and other applications.
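As a rough illustration of the slicing idea - NetOS's actual API is not described here, so every name below is hypothetical - a slice simply restricts which devices may exchange traffic:

```python
from dataclasses import dataclass, field

@dataclass
class Slice:
    """A logically isolated sub-network of attached devices."""
    name: str
    devices: set = field(default_factory=set)

class Network:
    """Physical network partitioned into isolated virtual slices."""
    def __init__(self):
        self.slices = {}

    def create_slice(self, name):
        self.slices[name] = Slice(name)

    def attach(self, slice_name, device):
        self.slices[slice_name].devices.add(device)

    def can_talk(self, a, b):
        # Traffic is only allowed between devices sharing a slice.
        return any(a in s.devices and b in s.devices
                   for s in self.slices.values())

net = Network()
net.create_slice("city-wifi")
net.create_slice("iot-sensors")
net.attach("city-wifi", "ap-01")
net.attach("city-wifi", "phone-42")
net.attach("iot-sensors", "meter-07")
print(net.can_talk("ap-01", "phone-42"),   # True: same slice
      net.can_talk("ap-01", "meter-07"))   # False: isolated slices
```

The point of the abstraction is that a B2B service and a B2C service can share the same physical hardware while remaining logically invisible to each other.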
The funding, which is being provided by existing investor IP Group and new investor, Breed Reply, means that Zeetta can significantly accelerate its growth plans. This will enable the company to expand its commercial and technical teams and target new markets.
Zeetta Networks has been a virtual member of the Bristol SETsquared Centre since September 2015 and they will be looking to take up residence in the Bristol SETsquared Centre, housed in the iconic Engine Shed, as soon as possible.
“Since the formation of our company last May, we have achieved many impressive milestones, including considerable revenues from our first customers and grant funding from the European Commission through our participation in the REPLICATE lighthouse project. The investment from IP Group and Breed Reply cements the confidence of the market in our technology and our team," said Vassilis Seferidis, CEO of Zeetta Networks.
Breed Reply, Reply's advanced incubator, funds and supports the development of Internet of Things (IoT) start-ups in Europe and the USA. Based in London, with operational offices in Italy and Germany, Breed Reply supports entrepreneurs and young talent by quickly bringing new ideas to the market. This is done via funding at “seed” and “early stage” level; considerable support with significant transfer of business, managerial and technological know-how; and medium-term involvement to establish start-ups in their market. In the IoT sector, the main areas Breed Reply focuses on are fitness and wellness, healthcare, smart home, manufacturing, transportation and energy.

Membrane lasers embedded in CMOS for high speed communications

By Nick Flaherty

A new type of ultra-thin semiconductor laser under development at The University of Texas at Arlington can be embedded into mainstream silicon substrates to provide increased capacity and energy efficiency.
Weidong Zhou, UTA electrical engineering professor, is working to develop a new ultrathin semiconductor laser that will increase a chip's speed and capacity. A three-year grant from the US Army Research Office will fund development of a membrane laser less than one micron thick that is compatible with planar CMOS platforms. The key innovation is the integration of a compound semiconductor material with a silicon photonic crystal cavity, which allows a laser to be built directly on a silicon chip next to other electrical components, leading to higher speed and higher efficiency.
The first application of Zhou’s laser is in data centres, and the group is pursuing various membrane laser architectures for extreme energy efficient computing and communication systems. “We are looking for devices and components to be integrated on a chip,” said Zhou. “As we address electrical injection, integration with other devices on the chip and increased power capabilities, we can begin to apply this technology to products in the medical field or in the consumer arena. These applications could include portable electronics, sensing and imaging equipment, bio applications and wearable electronics.” 
“Big companies like IBM and Intel are using this technology for high-performance computing centres,” Zhou said. “The big push now is for the next big thing: smaller, faster, and less and less power consumption.”

Quantum dot photonics array opens up 80THz of new bandwidth for the data centre

By Nick Flaherty

Researchers in the UK and Japan have unlocked 80THz of fibre-optic bandwidth that will enable future exascale data centres and transform 5G networks.
The collaboration between the University of Bristol’s Department of Electrical and Electronic Engineering, Keio University and numerous Japanese industrial partners has designed, developed and prototyped an all-optical router that can unlock 80 THz of bandwidth across a newly defined frequency band named T-Band (thousand band) and the existing O-Band (original band). The adjacent bands span from 1.0 μm (300 THz) to 1.36 μm (220 THz) and are able to support 1600 channels at 50 GHz spacing.
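The quoted figures are consistent with a quick back-of-the-envelope check using the rounded band edges:

```python
# Rough check of the figures quoted above, using the rounded band edges.
C = 299_792_458  # speed of light, m/s

# Wavelength -> frequency for the quoted band edges (in THz, approximate).
f_at_1p00um = C / 1.00e-6 / 1e12   # ~299.8 THz, quoted as 300 THz
f_at_1p36um = C / 1.36e-6 / 1e12   # ~220.4 THz, quoted as 220 THz

span_thz = 300 - 220               # 80 THz of usable bandwidth
channels = span_thz * 1e12 / 50e9  # at 50 GHz channel spacing
print(span_thz, int(channels))     # 80 1600
```

Dividing the 80 THz span by the 50 GHz grid spacing gives exactly the 1600 channels the router is designed around.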
The fabricated and tested technology is based on cascaded arrayed waveguide gratings (AWGs) and is designed so that a 1600 x 1600 wavelength router can potentially be constructed to route data through the network. It uses quantum dot chips, originally developed by NICT in Japan, as the light sources.
“The technology and system proposed and prototyped will unlock the new frequency band and networks to support future exascale data centres, ‘zero-latency’ tactile optical internet, internet of everything, smart cities, fog computing and big data infrastructure among others," said Dr Georgios Zervas, Senior Lecturer in Optical and High Performance Networks in the Department of Electrical and Electronic Engineering.
A single passive optical system can interconnect over one million end points, such as broadband home users, IoT devices and data centre servers, while offering at least 10 Gb/s per end point. It is also future proof, since it is transparent to any communication signal, and it can potentially consume zero power due to its passive nature, says Hiroyuki Tsuda, Professor in the Faculty of Science and Technology at Keio University. “The enabling technologies for the new frequency band are the quantum dot based optical devices and the silica planar lightwave circuits designed for the new band," he added.

Bristol wins major V2X driverless car programme

Bristol is building on its strength in driverless car technology, hosting a £5.5m project to test out communications systems for driverless cars. The three year FLOURISH project will develop secure embedded V2V and V2I (V2X) technologies for driverless systems, working with the University of Bristol.

Importantly, FLOURISH will address the vulnerabilities in the technology operating connected vehicles, with a focus on the critical areas of cyber security and wireless communications. The consortium will seek to develop tools that enable vehicle manufacturers and transport authorities to provide a safe and secure ‘V2X’ communications network that combines vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) technologies.
“FLOURISH is an exciting addition to our portfolio of research in the field of connectivity for autonomous vehicles,” said Professor Andrew Nix, Dean of Engineering at the University of Bristol. “I particularly welcome the opportunity to work closely with Bristol City Council and South Gloucestershire Council on the real-world testing of autonomous vehicles. This will leverage existing investment in the Bristol city region to expand our validation and test capabilities in both urban and inter-urban networked environments.”
Dr Robert Piechocki, Senior Lecturer in the Department of Electrical and Electronic Engineering’s Communication Systems and Networks research group and the University’s project lead, added: “Autonomous cars will rely on secure and dependable wireless connectivity to enable advanced automotive safety features. The FLOURISH project is a unique opportunity to showcase our research that will underpin mobility services of the future.”
Dr Theo Tryfonas, Senior Lecturer in Systems Engineering and a member of the project team, commented: “The security and trustworthiness of wireless connectivity, as well as the privacy of the relevant data in terms of location, personally identifiable attributes of users etc. will be a factor of paramount importance for their successful operation and integration with society.”

From SW Innovation News By Nick Flaherty

Friday, January 29, 2016

NASA to build first integrated-photonics modem

By Nick Flaherty

A team at NASA is to build an integrated photonics modem that could be a significant step forward in telecommunications, medical imaging, advanced manufacturing and quantum computing.

The agency's first-ever integrated-photonics modem will be tested aboard the International Space Station beginning in 2020 as part of NASA's multi-year Laser Communications Relay Demonstration, or LCRD. The cell phone-sized device incorporates optics-based functions, such as lasers, switches, and wires, onto an etched silicon substrate. 

Once aboard the space station, the so-called Integrated LCRD LEO (Low-Earth Orbit) User Modem and Amplifier (ILLUMA) will serve as a low-Earth orbit terminal for NASA's LCRD using high-speed, laser-based communications.

Since its inception in 1958, NASA has relied exclusively on RF links, but with higher data rates than ever before, the need for LCRD has become more critical, said Don Cornwell, director of NASA's Advanced Communication and Navigation Division within the Space Communications and Navigation Program, which is funding the modem's development.

LCRD promises data rates 10 to 100 times faster than today's communications equipment, requiring significantly less mass and power. The project, which is expected to begin operations in 2019, is designed to be an operational system after an initial two-year demonstration period. It involves a hosted payload and two specially equipped ground stations.

"Integrated photonics are like an integrated circuit, except they use light rather than electrons to perform a wide variety of optical functions," Cornwell said. Recent developments in nanostructures, metamaterials and silicon technologies have expanded the range of applications for these highly integrated optical chips. Furthermore, they could be lithographically printed in volume - just like electronic circuitry today - further driving down the costs of photonic devices. This will also be a key enabler for photonics-based quantum computing at room temperature.

"We've pushed this for a long time," said Mike Krainak, who is leading the modem's development at NASA's Goddard Space Flight Center in Greenbelt, Maryland. "The technology will simplify optical system design. It will reduce the size and power consumption of optical devices, and improve reliability, all while enabling new functions from a lower-cost system. It is clear that our strategy to leverage integrated photonic circuitry will lead to a revolution in Earth and planetary-space communications as well as in science instruments."

In addition to leading ILLUMA's development, Krainak serves as NASA's representative on the country's first consortium to advance integrated photonics. Funded by the U.S. Department of Defense, the non-profit American Institute for Manufacturing Integrated Photonics, with headquarters in Rochester, New York, is developing low-cost, high-volume, manufacturing methods to merge electronic integrated circuits with integrated photonic devices.
Mike Krainak and his team plan to replace portions of this fibre-optic receiver with an integrated-photonic circuit, whose size will be similar to the chip he is holding. The team then plans to test the advanced modem on the International Space Station. CREDIT: NASA/W. HRYBYK

NASA's Space Technology Mission Directorate (STMD) also appointed Krainak as the integrated photonics lead for its Space Technology Research Grants Program, which supports early-stage innovations. 

Under the NASA project, Krainak and his team will reduce the size of the terminal, now about the size of two toaster ovens (above). Although the modem is expected to use some optical fibre, ILLUMA is the first step in building and demonstrating an integrated photonics circuit that ultimately will embed these functions onto a chip, he said.

"What we want to do is provide a faster exchange of data to the scientific community. Modems have to be inexpensive. They have to be small. We also have to keep their weight down," said Krainak. The goal is to develop and demonstrate the technology and then make it available to industry and other government agencies, creating an economy of scale that will further drive down costs. "This is the pay off," he said.

The technology can also be used in the data centre to reduce size and power consumption. "Google, Facebook, they're all starting to look at this technology," Krainak said. "As integrated photonics progresses to be more cost effective than fibre optics, it will be used," he added. "Everything is headed this way."

Top ten semiconductor R&D companies

By Nick Flaherty

Semiconductor industry spending on research and development grew by just 0.5% in 2015 but still reached a record $56.4bn, according to the latest report from IC Insights in the US.

This was the smallest increase since the 2009 downturn year and significantly below the compound annual growth rate (CAGR) of 4.0% over the last 10 years, driven by growing concerns about the weak global economy, slumping sales in the second half of the year, and the unprecedented industry consolidation through a huge wave of merger and acquisition agreements.
Intel continues to lead all semiconductor companies in R&D spending in 2015, accounting for 22% of the industry’s total research and development expenditures. Following Intel in the 2015 R&D ranking are Qualcomm, Samsung, Broadcom (now being acquired by Avago), and the world’s largest wafer foundry, TSMC. The top five spenders were unchanged from 2014, but below that point, the rankings of most companies were shuffled. Micron Technology moved up to sixth in 2015, swapping positions with Toshiba, which fell to seventh in the new ranking. MediaTek went from ninth in 2014 to eighth place, while SK Hynix climbed from 12th to ninth in 2015. ST slid from eighth in 2014 to 10th in 2015, and Nvidia fell out of the top 10 to 11th place in 2015.

The top 10 in the R&D ranking collectively increased spending on research and development in 2015 by about 2% compared to the half-percent increase for total semiconductor R&D expenditures in the year. Combined R&D spending by the top 10 exceeded total expenditures by the rest of the semiconductor companies (about $30.8 billion versus $25.6 billion) in 2015—something that has continued to hold true since 2005.

Intel’s R&D expenditures grew 5% in 2015, which is significantly below its 13% average increase in spending per year since 2010 and slightly under its 8% annual growth rate since 2001, the new report says. Underscoring the growing cost of developing new IC technologies, Intel’s R&D-to-sales ratio has climbed significantly, from 9.3% in 1995 to 16.4% in 2010 and 24.0% in 2015.

With worldwide semiconductor sales falling nearly 1% in 2015 to $353.6 billion and R&D spending rising 0.5% to $56.4 billion, the industry’s R&D-to-sales ratio grew slightly to 16.0% from 15.8% in 2014. Since 2000, the semiconductor industry’s annual R&D-to-revenue ratio has averaged 16.0%. The new McClean Report forecasts semiconductor R&D spending to grow about 4% in 2016 to $58.9 billion and to reach $76.3 billion in 2020, which would represent a CAGR of 6.7% from 2015.
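The quoted ratio follows directly from the report's figures, and the top-10 split quoted earlier adds up to the same total:

```python
# Cross-checks on the IC Insights figures quoted above.
rnd_total = 56.4           # $bn, 2015 semiconductor R&D spending
sales = 353.6              # $bn, 2015 worldwide semiconductor sales
top10, rest = 30.8, 25.6   # $bn split: top 10 spenders vs everyone else

print(f"R&D-to-sales: {rnd_total / sales:.1%}")  # 16.0%
assert abs((top10 + rest) - rnd_total) < 0.1     # the split adds up
```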

The supermarket of the future - video

By Nick Flaherty

A very interesting video of Intel's view of the supermarket of the future. Using the grammar of a market stall, where you point at what you want, the stall recognises the gesture with Intel's RealSense 3D camera and pulls up data about the product, from allergens to food miles. Demonstrated in Italy and built by consultancy Accenture, there's a lot of IoT and big data behind the interface.

ARM-based server-on-a-chip at heart of key European IoT project

By Nick Flaherty

The X-Gene server-on-a-chip from Applied Micro Circuits has been chosen as the exclusive server platform that will be used to develop UniServer, a universal system architecture and software ecosystem for micro-servers.
The project starts next month and will develop technology that will be ported onto the X-Gene chip and evaluated using smart emerging applications deployed both in classic cloud business data centres and in newer environments closer to the data sources. The three-year, €4.8m project is funded by the European Union's Horizon 2020 Research and Innovation programme and includes ARM and IBM.
Based on a custom implementation of the ARM instruction set, X-Gene was selected for its ability to serve both data-centre infrastructures and smaller-scale centres. As part of the project, AppliedMicro will develop custom-made platforms based on X-Gene that address the complexities associated with Big Data and IoT applications. The data-centre-in-a-box will include a number of key features, including extensive debug capabilities, explicit support for voltage setting and per-core frequency scaling, error monitoring, advanced DRAM capabilities and performance counters.
“The goal of UniServer is to improve the energy efficiency, performance, dependability and security of current state-of-the-art micro-servers, and AppliedMicro quickly emerged as the platform of choice for this research after an extensive market search,” said Dr. Georgios Karakonstantis, project coordinator and scientific director, UniServer. “With two generations of its X-Gene technology already in production and the third generation sampling later this year, AppliedMicro offers one of the most competitive processors on the market for data centre and scale-out environments, bringing to UniServer a range of advanced and unique capabilities.”
“AppliedMicro is excited to be a member of the UniServer Consortium to bring a cohesive server platform to the European Union,” said Paramesh Gopi, president and CEO of AppliedMicro. “The pervasiveness and proven ability of the X-Gene platform complements the goal of the UniServer project to improve the performance of servers that run internet and cloud-based services, while also reducing design, implementation costs and power consumption.”
AppliedMicro is joined in the UniServer consortium by partners covering a wide range of scientific, research and engineering expertise, including The Queen’s University of Belfast, University of Athens, Thessaly and Cyprus, Worldsensing, Meritorius and Sparsity. 

Wednesday, January 27, 2016

Falling sales see ST pull out of set top box business

By Nick Flaherty

The downturn seems to be hitting STMicroelectronics already, with a drop in turnover and profits for the last year. This has prompted the company - one of the leading embedded chip makers - to pull out of the set-top box chip business where it once led the world.

"After an extensive review of external and internal options for the future of the Company’s set-top box business, ST will discontinue the development of new platforms and standard products for set-top-box and home gateway," said the company. "The slower than expected market adoption of leading-edge products and increasing competition on low-end boxes, combined with the required high level of R&D investment, has led this business to generate significant losses in the course of the last years."

The company was once the dominant supplier of STB chips, battling LSI Logic, but has lost out to lower-cost suppliers in recent years. It had been using the transputer architecture acquired from UK firm Inmos as the core controller for these chips, but the move to high-end 4K and UltraHD systems has not happened quickly enough to save the business unit, as highlighted in today's figures.

The company saw revenue decline to $6.8bn from $7.4bn last year, with operating profits falling from $258m to $174m, a substantial drop.

"ST’s digital business is at the core of the company’s strategy," it said. "It represents a significant share of ST’s revenues and focuses on growing applications, with a portfolio that includes general purpose and secure microcontrollers, digital automotive products, ASICs and specialized imaging sensors."

As a result, the company is to redeploy about 600 employees currently associated with the set-top box business to support ST’s growth ambitions in digital automotive and microcontrollers.

Around 1400 employees will also be affected: about 430 in France through a voluntary departure plan, about 670 in Asia and about 120 in the US. The company had closed its set top box development activity in Bristol, UK, back in 2014.

The restructuring will cost around $170m but is expected to save $170m a year.

Sony positions itself for IoT with Altair buy

By Nick Flaherty

Sony may have been selling off parts of its semiconductor business over the last few years, but it is back on the consolidation trail with a key buy of wireless innovator Altair Semiconductor.

Altair, based in Israel, originally started developing WiMax chipsets, but moved to 4G when WiMax stalled and 4G LTE took over. With 4G well established in silicon, the focus now is on 5G specifications, which include 10Gbit/s download speeds and a challenging 1ms round trip link latency that is aimed at devices in the Internet of Things (IoT).

Both these specifications are music to the ears of Sony, which is looking to combine the bits of its semiconductor business it kept - CMOS image sensors and GPS navigation chips - with wireless links, and Altair has truly leading edge expertise in these areas. The current 4G modem chips and software stand out for their low power consumption, high performance and competitive cost. 

The $212m Altair deal is expected to complete quickly, closing in early February. This is much quicker than the restructuring of Sony's own electronic component business, which is due to complete in April this year and brings together its semiconductor, battery and storage technologies into Sony Semiconductor Solutions.

This is part of Sony's vision of more and more "things" equipped with cellular chipsets in a connected environment in which "things" can reliably and securely access network services that leverage the power of cloud computing, it says. It is aiming not only to expand Altair's existing business, but also to move forward with research on and development of new sensing technologies.

Wednesday, January 20, 2016

Microchip underbids for Atmel and wins

By Nick Flaherty

Microchip Technology has booted Dialog Semiconductor out of its deal to buy Atmel with a lower offer but more cash.

Microchip today signed a definitive agreement to acquire Atmel for $8.15 per share in a $3.56bn combination of cash and shares. This compares to the $4.6bn offer from Dialog, although Microchip's cash offer of $7 a share is significantly higher.  
This potentially puts Dialog at risk of being acquired itself. A $137m termination fee will help, but still leaves Dialog vulnerable in this time of manic mergers.

"Our Board of Directors determined, after consultation with our financial advisor and outside legal counsel, that the transaction with Microchip is a superior proposal for Atmel's stockholders under the terms of our merger agreement with Dialog Semiconductor plc that we terminated today. Under the Microchip transaction, Atmel stockholders will receive a much higher cash consideration per share compared to the Dialog deal, as well as the opportunity for further upside through the ownership of stock of Microchip," said Steven Laub, President and CEO of Atmel.

Microchip bought Micrel in May last year, and this deal takes out Atmel as a major competitor, consolidating customer business in Microchip, as there was more overlap between the two companies' product lines than in the Dialog deal (as in the chart on the left). The combination of Atmel and Microchip will have revenue of around $4bn (compared with $3.5bn for a Dialog/Atmel combination), and this will lead to similar estimated cost savings of $170m from April 2018. However, the deal still needs to be approved by Atmel's shareholders, so it's not a done deal yet.

"We are delighted to welcome Atmel employees to Microchip and look forward to closing the transaction and working together to realize the benefits of a combined team pursuing a unified strategy. As the semiconductor industry consolidates, Microchip continues to execute a highly successful consolidation strategy with a string of acquisitions that have helped to double our revenue growth rate compared to our organic revenue growth rate over the last few years. The Atmel acquisition is the latest chapter of our growth strategy and will add further operational and customer scale to Microchip," said Steve Sanghi, President and CEO of Microchip.

The transaction has been approved by the Board of Directors of each company and is expected to close in the second quarter of calendar year 2016, subject to approval by Atmel's stockholders, regulatory approvals and other customary closing conditions.

Tuesday, January 19, 2016

Top 25 passwords still the main security vulnerability

By Nick Flaherty

Despite years of problems, passwords remain the major vulnerability for systems and that is set to get even worse with the smart home and the Internet of Things. Webcams and other smart devices are frequently compromised by weak passwords.

The latest top 25 passwords from SplashData for 2015 (compared with 2014) will have you holding your head in amazement and despair for the future of the industry, as it seems only the length of the passwords is changing:

1. 123456 (Unchanged)
2. password (Unchanged)
3. 12345678 (Up 1)
4. qwerty (Up 1)
5. 12345 (Down 2)
6. 123456789 (Unchanged)
7. football (Up 3)
8. 1234 (Down 1)
9. 1234567 (Up 2)
10. baseball (Down 2)
11. welcome (New)
12. 1234567890 (New)
13. abc123 (Up 1)
14. 111111 (Up 1)
15. 1qaz2wsx (New)
16. dragon (Down 7)
17. master (Up 2)
18. monkey (Down 6)
19. letmein (Down 6)
20. login (New)
21. princess (New)
22. qwertyuiop (New)
23. solo (New)
24. passw0rd (New)
25. starwars (New)
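One obvious defence for system designers is to reject any password that appears on such a list. A minimal sketch in Python, using a handful of entries from the list above (the 12-character minimum reflects the length advice given further down, not a rule from SplashData):

```python
# Reject candidate passwords that are too short or appear on a
# known-weak blocklist (a few entries from the SplashData 2015 top 25).
WEAK_PASSWORDS = {
    "123456", "password", "12345678", "qwerty", "12345",
    "123456789", "football", "1234", "1234567", "baseball",
    "welcome", "letmein", "starwars", "passw0rd",
}

def is_acceptable(candidate: str) -> bool:
    """Return False for blocklisted or too-short passwords."""
    return len(candidate) >= 12 and candidate.lower() not in WEAK_PASSWORDS

print(is_acceptable("starwars"))                   # False: too short and blocklisted
print(is_acceptable("correcthorsebatterystaple"))  # True
```

In a real system the blocklist would be far larger and checked against the full breach corpora, but the set-membership test is the same.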

"Given all the recent and historical news on data breaches of personal e-mail accounts, social media accounts and even phone account passwords, it is a wonder that we are still using password combinations that are incredibly easy to guess," said Richard Cassidy, Technical Director for Europe at Alert Logic. "The challenge is that cyber criminals are well aware that many of their targets still fail to employ a strong password policy, and as such will “pre-load” their dictionary attacks for brute-force access with the combinations listed, which in turn means almost instant access to a substantial number of users' personal data. Passwords such as these are dangerous because they are the first combinations attempted by attackers' brute-force tools."

"Unfortunately, however, even with complex passwords we are almost fighting a losing battle. Cyber criminals can access botnet ecosystems to crack encrypted files or password-protected data (through hashes of the password, or direct brute-force attack), or make use of underground “cracking rigs” built from GPUs that can quite literally attempt billions of combinations per second. This means your average 8-character password (mandated by many online systems today) can be cracked in days. A great deal of research has gone into the minimum recommended password length; all users should be choosing passwords of at least 12 characters (alphanumeric with special characters) that are completely random and that would challenge even the most sophisticated decryption rigs on the cyber criminal underground," he said.
Overall there are two approaches to protecting your data, says Cassidy: first, protect access to data stores (e-mail, social media, online file sharing) with passwords of at least 12 characters; second, encrypt key data files with strong cipher algorithms.

Part of the challenge is devising and managing passwords, says Andy Green, Technical Specialist at Varonis. “People are bad at coming up with their own passwords," he said. "For convenience, we make them obvious or short or both. Hackers are good, and getting better all the time, at breaking them, either through brute-force guessing or dictionary-style attacks if the hackers have access to the password hash. Keep in mind that a password with only six characters can be one of around 200 billion combinations – not a large number in the current era of big data. By increasing your password by only two characters, you’ve increased the possible combinations to almost a quadrillion – a serious computation challenge for attackers."
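The arithmetic behind Green's figures is simple exponentiation: each extra character multiplies the search space by the size of the alphabet. The 72-character alphabet below (upper and lower case letters, digits and ten symbols) is an assumption for illustration, chosen because it lands near the numbers he quotes:

```python
# Search-space size for a random password: alphabet_size ** length.
# ALPHABET = 72 is an assumed character set (26 + 26 + 10 + 10 symbols).
ALPHABET = 72

def combinations(length: int) -> int:
    """Number of possible passwords of the given length."""
    return ALPHABET ** length

print(f"6 chars: {combinations(6):.2e}")  # on the order of 100 billion
print(f"8 chars: {combinations(8):.2e}")  # approaching a quadrillion
```

Two extra characters multiply the space by 72 squared, roughly a 5000-fold increase, which is why length matters more than any single clever substitution.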

He has some tips. "Sure, you should have at least 8 characters, but better yet use the ‘correct horse battery staple’ method. What’s that? Essentially, it’s a memory trick where each letter of the password represents a word in a story. So ‘I just wrote a comment about passwords for the press’ becomes ‘Ijwacapftp’. That’s an unguessable password for hackers but one that you’ll never forget!”
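The mnemonic Green describes, taking the first letter of each word of a memorable sentence, takes only a line of code to sketch:

```python
# Sketch of the 'first letters of a sentence' mnemonic: keep the
# initial of each word, preserving its case.
def mnemonic_password(sentence: str) -> str:
    return "".join(word[0] for word in sentence.split())

print(mnemonic_password("I just wrote a comment about passwords for the press"))
# → Ijwacapftp
```

The result looks random to a dictionary attack but is trivially reconstructed by anyone who remembers the sentence; mixing in digits or punctuation from the story strengthens it further.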

