

Latest News

Qualcomm Could Unveil Its Next-gen Flagship Chip

Oct 7, 2019
There's speculation that the chip designer will unveil the Snapdragon 865 Mobile Platform. Expected to be found inside high-end Android devices next year, the new chipset will be produced by Samsung using its 7nm EUV process. The smaller the process number, the more transistors fit inside the chip, making it more powerful and energy-efficient. And extreme ultraviolet lithography (EUV) is a more precise method of patterning a chip die for transistor placement. Qualcomm’s latest top-of-the-line chipset is the Snapdragon 855+, an overclocked version of the Snapdragon 855 Mobile Platform that offers a 15% improvement in graphics capabilities.
 
Nevertheless, there could be another reason for the announcement. As it turns out, several new Android handsets are expected to be unveiled on Tuesday, including a pair from Xiaomi (the Xiaomi Mi 9 Pro 5G and Xiaomi Mi MIX Alpha), the Sony Xperia 5 and the Realme X2. Considering this, the buzz around the water cooler suggests that one or more of these devices might be the reason for the teaser that Qualcomm posted yesterday for the upcoming event. All of the aforementioned phones will employ a Snapdragon SoC, with the 855+ expected inside the Mi 9 Pro 5G and probably the Mi MIX Alpha. The regular Snapdragon 855 SoC will power the Xperia 5, with the Snapdragon 730G chip driving the X2. There may also be a connection between the number “3” used in Xiaomi’s teaser and the three smartphone manufacturers discussed in this paragraph.
 
While Samsung is handling the fab work for the Snapdragon 865, Qualcomm will be returning to Taiwan Semiconductor Manufacturing Company (TSMC) for 2021’s Snapdragon 875 Mobile Platform. The world’s largest independent foundry, TSMC rolls chips off the assembly line for companies that design their own chips but don’t have the facilities to make them. For example, both Apple and Huawei design their own SoCs, the A13 Bionic and Kirin 990 respectively, yet both rely on TSMC to churn out the chips they've designed.
 
As for the Snapdragon 865, traditionally Samsung’s new Galaxy S phones have been the first with a global release to sport the latest Qualcomm Snapdragon chipset, and that more than likely will not change next year. The very first phone to be powered by the Snapdragon 855 Mobile Platform was the Xiaomi Mi 9, but this device was not offered worldwide.
 
2019 has been quite a disruptive year for Qualcomm. It started with the chip designer in the midst of a feud with Apple, with both companies preparing to square off in court multiple times. Qualcomm was also the defendant in an antitrust case brought by the FTC. The non-jury trial in January was presided over by Judge Lucy Koh (of Samsung v. Apple fame). During the proceedings, Apple and other firms testified against Qualcomm’s sales practices, including its “No license, no chips” policy, the computation of royalties based on the retail price of a phone, and its failure to license its standards-essential patents in a Fair, Reasonable and Non-Discriminatory (FRAND) manner.
 
Things took a turn for the better in April (as far as Qualcomm is concerned) just as a court battle with Apple was wrapping up; the two companies agreed to a settlement. All legal action between the companies was terminated and Apple paid Qualcomm an undisclosed amount believed to be $4.5 billion; in return, Apple received a six-year license (with a two-year option) and a multi-year chip supply agreement.
 
And so Qualcomm sailed along, but only for a month. In May, the decision came in and Judge Koh ruled that Qualcomm had engaged in anticompetitive behavior. Losing this court case could force the chip designer to overhaul its current business practices. And while Judge Koh declined to grant Qualcomm a stay that would allow it to maintain the status quo until all of its appeals have been exhausted, last month the Ninth U.S. Circuit Court of Appeals granted the stay.
 
If Qualcomm fails to get Judge Koh’s ruling overturned on appeal, it faces the long, complex and difficult task of renegotiating all of the current contracts it has with phone manufacturers. The chip designer asked for the stay because it didn't want to go through this process, win on appeal, and then have to come to terms on a whole new set of contracts.
 

Microsize Lens Pushes Photonics Closer to an On-Chip Future

Oct 7, 2019
Optical microcomputing, next-generation compact LiDAR units, and on-chip spectrometers all took a step closer to reality with the recent announcement of a new form of optical lens. The lens isn't fabricated from glass or plastic, however. Rather, this low-loss, on-chip lens is made of thin layers of specialized materials on top of a silicon wafer. These “metasurfaces” have shown much promise in recent years as a new, microscale medium for containing, transmitting, and manipulating light.
 
Photonics at the macro-scale is more than 50 years old and has applications today in fields such as telecommunications, medicine, aviation, and agriculture. And yet, shrinking all the elements of traditional photonics down to microscale — to match the density of signals and processing operations inside a conventional microchip — involves completely new optical methods and materials.
 
A team of researchers at the University of Delaware, including Tingyi Gu, an assistant professor of electrical and computer engineering, recently published a paper in the journal Nature Communications that describes their effort to make a lens from a thin metasurface material on top of a silicon wafer. Gu says that metasurfaces have typically been made from thin metal films with nanosized structures in them. These “plasmonic” metasurfaces offered the promise of, as a Nature Photonics paper from 2017 put it, “Ultrathin, versatile, integrated optical devices and high-speed optical information processing.”
 
The challenge, Gu says, is that these “plasmonic” materials are not perfectly transparent like windowpanes. Traveling just fractions of a micrometer can introduce signal loss ranging from a few decibels to tens of dB. “This makes it less practical for optical communications and signal processing,” she says.
 
Her group uses a different kind of metasurface made from etched dielectric materials atop silicon wafers. Making optical elements out of dielectric metasurfaces, she says, could sidestep the signal loss problem. Her group’s paper notes that their lens introduces a signal loss of less than one dB.
 
Even a small improvement (and going from handfuls of dB down to fractions of a dB is more than small) will make a big difference, mainly because a real-world photonics chip might one day have many such components in it. And the lossier the photonics chip, the more laser power needs to be pumped through it. More power means more heat and noise, which might ultimately limit the extent to which the chip could be miniaturized. But with her team’s dielectric metasurface lens, “We can make a device much smaller and more compact,” she says.
 
Her group's lens is made from a configuration of gratings etched in the metasurface — following a wavy pattern of vertical lines that looks a bit like the Cisco company logo. Gu’s group was able to achieve some of the familiar properties of lenses, including converging beams with a measurable focal length (8 micrometers) and object and image distances (44 and 10.1 µm). The group further used the device's lensing properties to perform an optical Fourier transform on a signal — a property of classical, macroscopic lenses.
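 
For readers who want to check the figures, the reported distances are roughly consistent with the ordinary thin-lens relation 1/f = 1/d_o + 1/d_i. The short Python sketch below assumes that relation applies to the metasurface lens; the paper itself may use a more rigorous optical model.
 
# Rough consistency check of the reported lens parameters, assuming the
# ordinary thin-lens relation 1/f = 1/d_o + 1/d_i (an assumption made here
# for illustration; consult the paper for the exact model).
d_o = 44.0   # object distance in micrometers (from the article)
d_i = 10.1   # image distance in micrometers (from the article)
f = 1.0 / (1.0 / d_o + 1.0 / d_i)
print(f"implied focal length = {f:.1f} um")  # ~8.2 um, close to the ~8 um reported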
 
Gu says that next steps for their device include exploring new materials and working toward a platform for on-chip signal processing. “We’re trying to see if we can come up with good designs to do tasks as complicated as what traditional electronic circuits can do,” she says. “These devices have the advantage that they can process signals at the speed of light. It doesn’t need logic signals going back and forth between transistors. … It’s going to be fast.”
 

Apple Increases Production of iPhone 11

Oct 7, 2019
Apple has asked suppliers to increase production of its latest iPhone 11 range by nearly 10%, or 8 million units, the Nikkei Asian Review has learned, as a result of better-than-expected worldwide demand for its new, lower-priced handset.
 
The growth in orders appears to validate Apple CEO Tim Cook's new strategy of appealing to budget-conscious consumers with cheaper models amid a weakening world economy. The order boost of between 7 million and 8 million units is roughly equal to total annual phone shipments this year by Google, a rising iPhone rival in Apple's home U.S. market.
 
''This autumn is so far much busier than we expected,'' one source with direct knowledge of the situation said. ''Previously, Apple was quite conservative about placing orders,'' which were lower than for last year's new iPhones. ''After the increase, planned production volume for the iPhone 11 series will be higher compared to last year,'' the source said.
 
Shares of Apple component manufacturers rallied in Japan right after publication of the Nikkei report, outperforming the broader market. Japan’s Minebea Mitsumi closed up 3%, troubled iPhone screen maker Japan Display rose by around 2%, while Murata Manufacturing and Alps Alpine also gained.
 
Apple announced its three new iPhone models - the iPhone 11, 11 Pro, and 11 Pro Max - in early September, and for the first time in its history reduced the starting price of the model upgrade, despite better cameras, to $699, compared with $749 for last year's iPhone XR. Apple's new budget-conscious strategy comes as the global smartphone market is expected to shrink for the third year running, according to research company IDC. In January, Cook acknowledged that ''price is a factor'' behind Apple's slowing sales, especially in emerging markets.
 
The latest surge in iPhone orders is centered on the cheapest iPhone 11 model and the iPhone 11 Pro model, sources said, while Apple has slightly revised down orders for its top-of-the-range model, the iPhone 11 Pro Max, which has a starting price of $1,099. Cook recently told German newspaper Bild that he could not be happier with the iPhone 11 launch and that its sales had enjoyed a ''very strong start.'' Apple's share price has advanced approximately 40% this year and is now close to its October 2018 record high.
 
Nevertheless, suppliers remain cautious and said they were anxious that the higher level of orders would not be sustained. ''Demand is good for now. But we have to be careful not to be too optimistic,'' one executive-level source told the Nikkei. ''I hope that this year's peak season lasts longer than last year.''
 
One factor that may have temporarily stimulated demand is that Apple's iPhone 11 is still manufactured in China, and Washington has postponed a planned 10% tariff on China-made electronic imports. The delay in the tariff hike, from September to December 15, will help demand during the Thanksgiving and Christmas shopping seasons. Donald Trump has hardened his posture on trade talks with China, saying in September he did not want an interim truce.
 
Nevertheless, the uptick in iPhone orders is a welcome change in recent fortunes for California-based Apple.
 
Just last year, Apple asked key iPhone assemblers Foxconn, which trades as Hon Hai Precision Industry, and Pegatron to call off additional production only weeks after the iPhone XR hit the shelves. Then, this January, Apple made a rare move when it cut its quarterly sales forecast, blaming soft iPhone demand in China as its economy slowed. A sustained rise in demand now would therefore counter the 2018 drop in iPhone sales - the company's first since the iconic handset launched in 2007.
 
''Apple's pricing strategy this year so far turns out to have boosted some initial sales and preorders. ... However, given the weakening world economy and uncertainties ahead, we are concerned whether the good demand will last long,'' said Chiu Shih-Fang, a veteran smartphone analyst at Taiwan Institute of Economic Research.
 
''Even if the second half is definitely better than the first half, we need to monitor if the lower average sale price could have an impact on Apple's total revenue.''
 
Yasuo Nakane, head of global tech research at Mizuho Securities, said he had revised up 2019 iPhone manufacturing estimates to 194 million units from 178 million - although that is still lower than the 208.8 million iPhones sold in 2018. All models in the iPhone 11 range have better cameras than last year's, and are loaded with facial recognition and wireless charging features similar to 2018.
 
Unlike rivals Samsung Electronics, Huawei Technologies, Xiaomi and Oppo, Apple did not introduce 5G compatibility, the next-generation wireless communication standard that allows faster data transfer and lower latency. In the first half of 2019, Apple suffered a nearly 25% slump in iPhone shipments compared with 2018, according to IDC - far worse than its main rivals, Samsung Electronics and Huawei Technologies. The world's top two smartphone makers saw an almost 2% drop and a nearly 26% surge in shipments, respectively, over the same period.
 

How Much Power Will Quantum Computing Need

Oct 4, 2019
Google’s Quantum AI Lab has installed the latest generation of what D-Wave Systems describes as the world’s first commercial quantum computers. Quantum computing could potentially solve certain problems much faster than today’s classical computers while using relatively little power for the calculations. Yet the energy efficiency of quantum computing remains something of a mystery.
 
At the moment, D-Wave’s machines can scale up the number of quantum bits (qubits) they use without significantly increasing their power requirements. That’s because D-Wave’s quantum computing hardware relies on a specialized design consisting of metal niobium loops that act as superconductors when chilled to a frigid 15 millikelvin (-273° C). Much of the D-Wave hardware’s power consumption — a little less than 25 kilowatts for the latest machine — goes toward running the refrigeration unit that keeps the quantum processor cool. The quantum processor itself requires a comparative pittance.
 
“The operation of the quantum processor itself requires remarkably little power—only a tiny fraction of a microwatt—which is essentially negligible in comparison to the power needs of the refrigerator and servers,” says Colin Williams, director of business development & strategic partnerships at D-Wave Systems.
 
The new 1000-qubit D-Wave 2X machine installed at Google’s lab has around double the qubits of its predecessor, the D-Wave Two machine. But the minimal amount of power used by the quantum processor means that “the total system power will still remain more or less constant for many generations to come” even as the quantum processor scales up to thousands of qubits, Williams says. D-Wave can currently get away with this because the same “cryostat” unit that uses so many kilowatts of power would still be sufficient to cool much larger quantum processors than the ones currently in use.
 
''It would be similar if you attach a large cooling device to your PC that uses many kilowatts of power — you would barely see an increase in power consumption when going to larger systems since the power is dominated by the large cooling infrastructure,'' says Matthias Troyer, a computational physicist at ETH Zurich.
 
The ability to scale up a D-Wave machine’s computing capabilities without increasing its power consumption may sound appealing. But it actually doesn’t say much about the power efficiency of quantum computing compared with classical computing. Today’s D-Wave machines perform about as well as a high-end PC on certain specialized tasks, but they use far more power because of their extreme cooling requirements. (High-end computing cores require just tens of watts of power.)
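 
A back-of-the-envelope comparison makes the gap concrete. The figures below come from the article (25 kilowatts for the D-Wave system, tens of watts for a high-end classical core); the specific 50-watt value is an assumption chosen to stand in for "tens of watts."
 
# Illustrative power comparison only, using the article's figures.
dwave_system_w = 25_000    # total wall power, dominated by the cryostat
classical_core_w = 50      # assumed value for "tens of watts" per high-end core
ratio = dwave_system_w / classical_core_w
print(f"power ratio: about {ratio:,.0f}x")   # ~500x more power
# If time to solution is comparable on today's tasks, as the article suggests,
# energy per solution differs by roughly the same factor.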
 
“While the ‘flat power requirement’ is a good statement to make for marketing, it is unclear at the moment what the true power needs are once the device is optimized and scaled up,” Troyer says. “Right now they need orders of magnitude more power than competing classical technology.”
 
However, this isn't a completely fair comparison, Troyer says. “On the power side, they are currently losing,” he says. But the D-Wave machine “is not engineered to be power saving. It may pay off again at some point.”
 
Scott Aaronson, a theoretical computer scientist at MIT and a D-Wave critic, seemed bemused by the idea of D-Wave having a power advantage of any sort. Referring to D-Wave’s reliance on a cryogenic cooler, he wrote in an email: “It’s amusing chutzpah to take such a gigantic difficulty and then present it as a feature.” He noted that D-Wave may require an even more power-hungry cooling system to create the lower temperatures that would increase its quantum processors’ chances of a “speedup” advantage over classical computing in the future.
 
D-Wave’s quantum annealing machines represent just one possible architecture for quantum computing. They are designed to solve a specialized set of “optimization problems” rather than act as universal logic-gate quantum computers. (The latter would be super-fast versions of today’s classical “gate-model” computers.) Google’s Quantum AI Lab has invested both in D-Wave’s machines and in exploring the development of universal logic-gate quantum computers.
 
In due course, Troyer expects power requirements for quantum computing to be roughly “linearly proportional” to the number of qubits and their couplings, as well as proportional to the number of times operators must run and re-cool the system before it finds a solution.
 
Quantum computing’s real strengths most likely won’t begin to emerge until engineers build machines with many thousands or possibly millions of qubits. That’s still a ways off even for D-Wave, which has chosen to scale up the number of qubits in its processors fairly quickly. Most quantum computing researchers have opted for a much slower approach, building quantum computing devices with just a handful of qubits or tens of qubits, because of major challenges in correcting qubit errors and maintaining coherence across the system.
 
However, both D-Wave and independent quantum computing labs share the same general goal of building machines that can exploit the “spooky” effects of quantum physics. Quantum computers could potentially perform many more calculations at the same time than classical machines. If quantum computers can beat classical computers in terms of “time to solution,” they could also prove more power-efficient at the end of the day.
 
“If a quantum device can solve a problem with much better [time to solution] scaling than classical computing, it would also win on power,'' Troyer says.
 

Making Mobile Robots More Flexible in the Plant

Oct 4, 2019
MIT Professor David Mindell has spent his career defying traditional distinctions between disciplines. His work has explored the ways humans interact with machines, drive innovation, and maintain societal well-being as technology transforms our economy.
 
Mindell’s experience blending fields of study has shaped his views about the relationship between humans and machines. Those views are what led him to found Humatics — a startup whose name merges “human” and “robotics.” Humatics is trying to change the way humans work alongside machines by enabling location tracking and navigation indoors, underground, and in other areas where technologies like GPS are limited. It accomplishes this by using radio frequencies to track things at the millimeter scale — unlocking what Mindell calls microlocation technology.
 
The company’s solution is already being used in places such as shipping ports and factories, where humans work alongside cranes, industrial tools, automated guided vehicles (AGVs), and many other machines. These businesses often lack consistent location data for their machines and are forced to implement inflexible routes for their mobile robots.
 
“One of the holy grails is to have humans and robots share the same space and collaborate, and we’re enabling mobile robots to work in human environments safely and on a large scale,” Mindell says. “Safety is a critical first form of collaboration, but beyond that, we’re just beginning to learn how to work [in settings] where robots and people are exquisitely aware of where they are.”
 
The interdisciplinary perspective Mindell developed at MIT has helped him identify the shortcomings of technology that keep machines and humans from working together seamlessly. One specific shortcoming that Mindell has thought about for years is the lack of precise location data in places like warehouses, subway systems, and shipping ports.
 
“In five years, we’ll look back at 2019 and say, ‘I can’t believe we didn’t know where anything was,’” Mindell says. “We’ve got so much data floating around, but the link between the actual physical world we all inhabit and move around in and the digital world that’s exploding is really still very poor.”
 
In 2014, Mindell partnered with Humatics co-founder Gary Cohen, who had worked as an intellectual property strategist for biotech companies in the Kendall Square area, to solve the problem. In early 2015, Mindell collaborated with Lincoln Laboratory alumnus and radar expert Greg Charvat; the two built a prototype navigation system and launched the company two weeks later. Charvat became Humatics’ CTO and first employee.
 
“It was clear there was about to be this huge flowering of robotics and autonomous systems and AI, and I thought the things we learned in extreme environments, notably under sea and in aviation, had an enormous amount of application to industrial environments,” Mindell says. “The company is about bringing insights from years of experience with remote and autonomous systems in extreme environments into transit, logistics, e-commerce, and manufacturing.”
 
Bringing microlocation to industry
 
Factories, ports, and other locations where GPS data is unreliable or unavailable adopt a variety of solutions to meet their tracking and navigation needs. But each workaround has its drawbacks. RFID and Bluetooth technologies, for example, can track assets but have short ranges and are expensive to deploy across large areas.
 
Cameras and sensing methods such as LIDAR can be used to help machines detect their environment, but they struggle with things like rain and varying lighting conditions. Floor tape embedded with wires or magnets is also often used to guide machines through fixed routes, but it isn’t well-suited for today’s far more dynamic warehouses and production lines.
 
Humatics has aimed to make the capabilities of its microlocation system easy to leverage. The location and tracking data it accumulates can be fed into whatever warehouse management system or IoT platform customers are already using. Its radio-frequency beacons have a range of up to 500 meters and, when installed as part of a constellation, can determine three-dimensional locations to within 2 centimeters, creating a virtual grid of the surrounding environment.
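 
To give a feel for how a beacon constellation can pin down a position from range measurements, here is a minimal least-squares multilateration sketch. It is purely illustrative: the article does not describe Humatics' actual algorithm, and the beacon layout, noise level, and helper function below are hypothetical.
 
import numpy as np

np.random.seed(0)

def locate(beacons, ranges):
    # Linearize by subtracting the first range equation from the others:
    # 2*(b_i - b_0) . x = (r_0^2 - r_i^2) + (|b_i|^2 - |b_0|^2)
    b0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - b0)
    y = (r0**2 - ranges[1:]**2) + (np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
    pos, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares 3-D position
    return pos

beacons = np.array([[0.0, 0.0, 3.0], [50.0, 0.0, 3.0],
                    [0.0, 40.0, 3.0], [50.0, 40.0, 6.0]])  # hypothetical fixed beacons (meters)
truth = np.array([12.0, 25.0, 1.5])                        # "unknown" position to recover
ranges = np.linalg.norm(beacons - truth, axis=1) + np.random.normal(0, 0.02, 4)  # ~2 cm range noise
print(locate(beacons, ranges))  # approximately [12, 25, 1.5]; accuracy depends on geometry and noise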
 
The beacons can be paired with an onboard navigation hub that helps mobile robots move around dynamic environments. Humatics’ system also collects location data from multiple points at once, monitoring the speed of a forklift, helping a crane operator place a shipping crate, and guiding a robot around obstacles at the same time. The data Humatics gathers also changes the way workers and machines share space and work together.
 
Indeed, with a new chip just emerging from its labs, Mindell says Humatics is moving industries such as manufacturing and logistics into “the world of ubiquitous, millimeter-accurate positioning.”
 

PC Maker HP To Cut Up To 9,000 Jobs In Restructuring Push

Oct 4, 2019
U.S. personal computer maker HP Inc said on Thursday that it will cut up to 16% of its workforce under a restructuring plan aimed at cutting costs. The company will cut about 7,000 to 9,000 jobs through a combination of employee exits and voluntary early retirement, it said in a statement.
 
HP estimates the plan will result in annual gross run-rate savings of about $1 billion by the end of fiscal 2022, it added. The company had about 55,000 employees globally as of Oct. 31, according to a filing with the U.S. Securities and Exchange Commission. That means the cuts target as much as 16% of its workforce, a Reuters calculation shows.
 
In connection with the restructuring, HP said it expects to incur an overall charge of over $1 billion, of which $100 million will be booked when it reports its fourth-quarter earnings. “We are taking bold and decisive actions as we embark on our next chapter,” said Enrique Lores, the company’s incoming chief executive officer.
 
“We see significant opportunities to create shareholder value and we will accomplish this by advancing our leadership, disrupting industries and aggressively transforming the way we work.”
 
Lores will take over as CEO on Nov. 1 from Dion Weisler. Palo Alto, California-based HP also said its board on Sept. 30 approved an additional $5 billion in share buybacks. HP expects to generate free cash flow of at least $3 billion in fiscal 2020 and return at least 75% of it to shareholders through a 10% quarterly dividend increase and share buybacks, it added.
 
The company said it expects adjusted earnings in the range of $2.22 to $2.32 per share for fiscal 2020. For the current fiscal year, it expects adjusted earnings in the range of $2.18 to $2.22, the company said when reporting its third-quarter earnings. HP’s shares have fallen about 10% this year up to Thursday’s close.
 

Samsung Pulls The Plug On Chinese Smartphone Production

Oct 4, 2019
Samsung this morning confirmed to Reuters that it has shuttered mobile phone manufacturing in China. The move comes as the company continues to struggle in the world’s No. 1 smartphone market.
 
As we observed in a deeper dive into China’s mobile phone sales back in August, the Korean hardware giant has struggled to retain even a market share in the low single digits. It’s not alone, of course; Apple, too, has faced an uphill battle to break into the market, which is dominated by homegrown names, including Huawei, Vivo, Oppo and Xiaomi.
 
Sales have been driven by a mixture of pricing and, in the case of embattled Huawei, patriotic buying decisions. Samsung has gradually phased out manufacturing in the country over the last year, suspending operations at some plants before finally pulling the plug altogether. The news follows a similar move by Sony. Apple, meanwhile, is sustaining its production in the country for now.
 
More recently, Samsung has looked to other countries, including India and Vietnam, which have undercut China’s production costs. The company will, however, continue selling phones in China, even as it eyes cheaper locations for manufacturing.
 

How Language Shapes Password Security

Oct 3, 2019
Despite differences in language and culture, both Chinese- and English-language Internet users evidently find common ground in using easily guessable password variants of “123456.” However, a recent study comparing password patterns across the two languages also found notable and unique features in Chinese passwords that have big implications for Internet security beyond China.
 
The password habits of Chinese-language users have been surprisingly understudied given that they make up more than 20 percent of all Internet users worldwide. A little over 854 million people use the Internet in China alone — more than double the entire population of the United States. That’s why a group of Chinese and U.S. researchers set out to test how password security among both Chinese- and English-language users stands up against the best cracking algorithms.
 
“Our work may be among the first studies to examine the passwords of different languages,” says Ding Wang, an information security researcher at Peking University, in Beijing.
 
Wang and his peers analyzed 106 million real passwords from nine Web services — 73 million passwords from six Chinese-language services and 33 million passwords from three English-language services — unveiled by hackers and leaked online between 2009 and 2012. They were careful to directly compare the security of passwords only from similar Web service counterparts among the mix of social forums, gaming services, e-commerce websites, and programmer forums, plus the Yahoo Internet portal on the English-language side of the data set. Their results appear in a paper [PDF] presented at the 28th USENIX Security Symposium held in Santa Clara, Calif., from 14 to 16 August.
 
What may seem like a strong password based on English-language assumptions could actually be quite weak and easy to guess from a Chinese-language perspective. Yet many of the world’s popular Web services, including some homegrown Chinese services, approach password security from an English-language perspective.  
 
The researchers pointed to the example of the popular Chinese password “woaini1314,” which is currently rated “strong” by password strength meters used by AOL, Google, and even the well-known Chinese social network Sina Weibo (and by IEEE Spectrum’s parent organization, IEEE). And yet speakers of Mandarin Chinese, the most widely spoken dialect of Chinese, can very quickly guess the “woaini1314” password, because “woaini” in Chinese pinyin (the romanized system of Chinese characters) means “I love you,” and “1314” sounds like “forever” in Chinese.
 
One key difference between Chinese-language and English-language passwords is that many Chinese-language users favor passwords consisting entirely of digits. Beyond the infamous “123456” password, other popular passwords among Chinese-language users include “111111,” “123123,” and “123321.” Playing on the love theme, “5201314” is used because it sounds like the phrase “I love you forever and ever” in Chinese. Some popular passwords add a letter to a string of digits, such as “a12345” and “12345a.”
 
Chinese-language users also frequently use their mobile phone numbers or certain dates (perhaps their birthdays) in passwords — something that English-language users don’t do as often. Instead, English-language users usually compose passwords made solely of letters and lean toward certain words or phrases such as the easily guessable “password,” “letmein,” “sunshine,” and “princess.” Some of the most popular passwords include “abcdef” and “abc123” alongside “123456.”
 
Passwords that use only digits are easier to crack than passwords made only of letters because the digit combinations draw on just 10 possible digits, as opposed to 26 letters in the modern English alphabet. But Chinese-language speakers quite often produced remarkably complex and creative passwords: Some members of the Chinese Software Developer Network (CSDN) service combined programming language commands with traditional Chinese poems.
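 
The size of the search space makes the point plain. The sketch below simply compares 10^n digit-only combinations against 26^n lowercase-letter combinations for a few password lengths; the lengths chosen are arbitrary examples.
 
# Search-space comparison for digit-only versus lowercase-letter passwords.
for n in (6, 8, 10):                 # example password lengths
    digits_only = 10 ** n            # "123456"-style passwords
    letters_only = 26 ** n           # "letmein"-style passwords
    print(f"length {n}: {digits_only:.1e} digit combos vs "
          f"{letters_only:.1e} letter combos ({letters_only // digits_only}x larger)")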
 
“Chinese users can be really creative with combinations of letters and digits,” says Yuan Tian, a computer scientist at the University of Virginia in Charlottesville, Va., and coauthor on the study. 
 
The password files used by the researchers contained hashes of leaked or stolen passwords, not plain-text versions of the passwords themselves. The researchers attempted to crack both Chinese-language and English-language passwords using two state-of-the-art algorithms for cracking passwords. They tested the Markov-chain model, which assigns various probabilities to password characters based on their relationships with one another, and the probabilistic context-free grammars (PCFG) model, which parses passwords into letter segments, digit segments, and symbol segments before estimating the order of the most likely combinations.
 
The team also upgraded the PCFG approach by customizing it to account for certain password patterns more common among Chinese-language users. For example, they added number segments in popular date formats and Chinese names as written in the romanized Pinyin system. They also gave their PCFG-based algorithm the ability to process the interleaving patterns — strings of alternating digits and letters — found in so many Chinese passwords.
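 
As a rough illustration of the PCFG idea of parsing a password into letter, digit, and symbol segments, here is a minimal sketch. It only shows the segmentation step; the study's actual models also learn probabilities for the resulting templates and segment values from training data, and the helper function here is hypothetical.
 
import re

def segments(password):
    # Split into runs of letters, digits, or other symbols and build a template
    # such as "L6D4" (6 letters followed by 4 digits).
    parts = re.findall(r"[A-Za-z]+|[0-9]+|[^A-Za-z0-9]+", password)
    template = "".join(
        ("L" if p[0].isalpha() else "D" if p[0].isdigit() else "S") + str(len(p))
        for p in parts
    )
    return template, parts

for pw in ("woaini1314", "a12345", "password!", "5201314"):
    print(pw, "->", segments(pw))
# woaini1314 -> ('L6D4', ['woaini', '1314']), an interleaved letter/digit pattern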
 
Together, those efforts boosted the modified PCFG-based algorithm’s performance versus the Chinese password data sets — it cracked between 98 percent and 188 percent more passwords than the standard version of the algorithm.
 
The results also highlighted key strengths and weaknesses of Chinese-language passwords in comparison with English-language passwords. Both types of algorithms cracked more of the easier Chinese passwords than English passwords when limited to 10,000 or fewer guess attempts. But the remaining Chinese passwords proved stronger than their English counterparts as the number of guesses grew beyond 10,000 attempts.
 
The number of guesses matters because some Web services limit the number of online guesses before temporarily locking a user’s account. Leaked or stolen password storage files can allow hackers to make an essentially unlimited number of offline guessing attempts because they do not have to worry about being locked out of a Web service. But even offline guessing attacks are still limited by the cost-effectiveness of spending computing time and resources on huge numbers of guess attempts.
 
It’s also clear that individual Chinese-language speakers can do themselves a favor by avoiding predictable digit patterns such as “123456” and “111111” for their passwords, along with the predictable letter and letter/digit hybrid patterns based on romantic themes of eternal love. (The same goes for English-language speakers still using “123456” and “abcdef”—just stop!)
 
The complexities of language’s influence on passwords may go even deeper within the Chinese-language community. Chinese-language users commonly rely on the same set of Chinese characters for reading and writing, but spoken Chinese has many regional variants, based on local dialects, that differ in pronunciation. For example, the pronunciation of “I love you” in Mandarin Chinese — considered mainland China’s official national language — sounds different from the pronunciation of the same phrase in the Cantonese branch of Chinese spoken by many people living in or originating from places such as Hong Kong, Macau, and Guangdong.
 
Those regional distinctions in spoken Chinese were beyond the scope of this particular study. But Tian noted that there may well be differences in password patterns if speakers of Cantonese, Hokkien, Shanghainese, or other regional variants of Chinese create passwords based on pronunciation.
 
In a deeper dive, the researchers hope to continue evaluating Chinese-language password patterns by using surveys to better understand what Chinese Internet users are thinking when creating their passwords. And they raised the possibility of continuing their comparative studies of passwords in languages beyond just Chinese and English. “For our future work, we want to cover passwords around the world beyond China,” Wang says.
 

Google One Paid Storage Service Gets More Phone Backup Features

Oct 3, 2019
If you do not have a Google One membership yet, you may well consider one after the latest improvements the paid storage service has just received. The Mountain View company recently announced that Google One is getting additional phone backup features on Android devices.
 
Google has taken the standard Android backup, which includes texts, contacts, and apps, and extended it with extra features like automatic phone backup. On top of that, Google One users will be able to back up original-quality photos, videos, and multimedia messages (MMS).
 
Not only that, users will be given the option to manage their backups directly from the Google One app. The storage service lets users restore everything they need when they set up their next Android phone.
 
Google One users should expect the new automatic phone backup feature to become available to them this week. Simply open the Google One app and head to the “Device Backup” section to check out the new improvements.
 

Chips Meet Chalk As Fujitsu's AI Scoring Comes To Gymnastics

Oct 3, 2019
A worldwide gymnastics tournament taking place in Germany this month has adopted Fujitsu's artificial intelligence for assessing performances, the Japanese technology group said Wednesday.
 
The Artistic Gymnastics World Championships, opening Friday in Stuttgart, will be the first tournament to apply Fujitsu's technology in an official capacity, according to the company. The system may well be put to use at the 2020 Tokyo Olympics, depending on its effectiveness.
 
The AI system uses sensors to capture a gymnast's movements and angle measurements and to develop 3D renderings of the performance. The tech will be deployed in conjunction with video replay when a senior judge or a national team asks for a review of the scoring.
 
Fujitsu developed the system with the International Gymnastics Federation through a venture that began in 2017. The technology is also expected to be used for injury risk assessment and to correct a gymnast's form during training. For this upcoming tournament, the system will support judging for the pommel horse, still rings, and the men's and women's vault. The tech will be applied to the other six events at a later date.
 

In-house Chips Power Huawei's 5G Drive Beyond China

Oct 3, 2019
Chinese telecom equipment manufacturers claim they have overcome technical and standards-related hurdles to establish themselves as worldwide leaders in 5G, winning customers from all over the world even amid a tough geopolitical environment.
 
Ken Hu, rotating chairman of Huawei Technologies, signaled his confidence in the company's superiority when he said in June that efforts to build fifth-generation wireless networks in Europe could be delayed by two years if Huawei base stations were excluded.
 
Speaking to reporters at the MWC telecommunications expo here - just about a month after Huawei was slapped with U.S. sanctions - Hu said the company had struck deals with 50 wireless carriers outside China, of which 28 were in Europe and 11 in the Middle East. The remainder included six deals in the Asia-Pacific region. He attributed Huawei's appeal to its cutting-edge technology and cost advantage.
 
Base stations, which receive and transmit signals, form the core of wireless infrastructure and contain the greatest advances in telecom technology. Each carrier needs individually customized equipment, making this a more lucrative field than smartphones, Huawei's other main business. Huawei presented new equipment for 5G base stations in February, featuring antennas that weigh just 20 kg - small enough to be installed by a single person, a presenter said. The new platforms pack the antenna and control systems into one small unit that is also suitable for 2G, 3G and 4G.
 
Miniaturization is of crucial importance to carriers making the jump to 5G. There are at least 300,000 4G base stations across Japan - and more than 10 times that many in China - that have to be upgraded, and smaller equipment cuts down on labor and construction costs. Huawei's advances in this area owe to its surmounting of two long-standing challenges.
 
One was its lack of internally developed processors for base stations. It now has its own chips produced by subsidiary HiSilicon Technologies, which was established in 2004 to help Huawei become more autonomous in semiconductors. The new chipset showcased in January handles data faster than other processors and allows the footprint of base station antennas to be cut in half.
 
''Our company has amassed huge amounts of the underlying technology [for base stations], from chips to materials and liquid coolant,'' said Li XiaoXun, head of the technological strategy department at Huawei's Japanese arm. Huawei's other big disadvantage lay in its use of a different type of wireless transmission system than that preferred by most of the world - something that is now essentially working in its favor.
 
In the 2000s, Huawei and its largest Chinese peer, ZTE, started developing TDD - or time division duplex - technology, in which incoming and outgoing signals share the same spectrum. At the time, other countries favored FDD, or frequency division duplex, which transmits and receives signals on separate frequency bands.
 
In 5G networks, data is sent with pinpoint precision to individual devices to prevent signals from interfering with each other even when large numbers of devices are concentrated in a small area. This is easier to implement with China's TDD than with FDD. ''Huawei's technical capabilities have been on par with those of European manufacturers during the 4G era,'' said Hiroyuki Morikawa, a professor at the University of Tokyo and a member of a Japanese 5G standards-setting body.
 
The Chinese company ''can potentially gain even more momentum in the 5G era'' by leveraging its overwhelming cost advantage, he said. European companies, notably Ericsson and Nokia, long played a leading role in setting telecom standards, with Chinese manufacturers forced to pay hefty fees to license the necessary technology. But Chinese players have since vaulted to the top of the technological heap, thanks partly to a campaign by the Chinese government for broader adoption of TDD that began around 2013.
 
Morikawa said Huawei has sent large numbers of employees to international meetings on standardization to communicate closely with officials and companies from other countries. Brisk Chinese demand has boosted the competitiveness and financial wherewithal of equipment makers there. After the Chinese government issued 5G licenses to telecom operators in June, state-owned China Mobile announced a plan at MWC Shanghai that month to install over 50,000 base stations in over 50 cities in 2019.
 
China lagged somewhat behind in approving 5G licenses, being the fifth country to do so. But its three big state-owned wireless companies got a head start by setting up test facilities, mainly in major cities, before being formally licensed. The largest, China Mobile, teamed up with Huawei to install base stations in a railway station and a mall in Shanghai. Huawei has poured $4 billion into 5G development over the past decade and plans to keep investing heavily in this area, according to Hu. Compatriot ZTE is also going on the offensive with its base station business.
 
''Ours are 30% cheaper than those of our European competitors,'' said Chen Liangwen, chief technology officer at ZTE Japan. Morikawa suggested that deeper pockets, rather than better technology, are the secret to Chinese telecom equipment makers' success. ''Their ability to put money earned from their massive domestic market into research and development is an advantage, but their actual technology isn't that different from that of other manufacturers,'' he said. 5G is a primary element of the Chinese government's ''Made in China 2025'' industrial modernization initiative. ''China's aim in building up its 5G network is upgrading its industry,'' said IHS Markit analyst Mitsue Oba.
 
With 100 times the speed of 4G technology, 5G is expected to enable technologies such as remotely operated industrial robots and autonomous vehicles. Chinese companies' advances in infrastructure are helping turn the country into a 5G superpower. The country's national 5G push aims to ''send a message that it has surpassed developing nations to achieve the world's highest level of telecom technology,'' Morikawa said.
 

Top 5 Digital Transformation Trends In Manufacturing For 2020

Oct 2, 2019
One of the most appealing things about digital transformation is that it moves quickly — and slowly — all at the same time. The trends we see on the horizon for Industry 4.0 in 2020 are quite similar to the trends we've seen growing … revolutionizing … during the last few years. The difference is not so much in the technology; it’s in the number of companies beginning to use it, and the reasons why.
 
Case in point: the Internet of Things. Technologists have been talking about the enormous benefit of the IoT for manufacturing for some time. Going into the 2020s, however, it’s not only the technologists who are touting the benefits. It is the manufacturing companies themselves discerning the significant effect of connecting their work and processes to the IoT. It is also increasingly demanding consumers who want higher-quality goods, often with responsible manufacturing practices, right now. If there’s one thing that’s clear, it’s that manufacturers will be experiencing increasing pressure on cost, efficiency and quality in the coming decade. In addition, they're finding that new tech adoption—be it IoT, 5G, AI, enterprise resource planning, or VR/AR training—is the only way to survive.
 
The following are some of the biggest technologies continuing to expand in 2020, and the reasons why.
 
Internet of Things: Less I and More AI
 
First, can we all celebrate the fact that we've (basically) dropped the extra I from the IIoT? Increasingly, digitization isn’t happening industry by industry — it is taking place everywhere. The IoT used in manufacturing overlaps at innumerable points with IoT in retail, consumer goods, healthcare, martech, and just about everything else. In fact, the continuous interplay of all those streams of data and connectivity is providing exceedingly important insights that are changing the way manufacturing is run. Beyond that, we are seeing a convergence of AI and IoT, with companies like SAS Software touting AIoT as the next wave for IoT, based on Gartner’s prediction that more than 80% of IoT projects will encompass AI.
 
For instance, yes, the IoT promises cost savings. It helps provide insights on processes, costs, productivity, and so on. But at the same time, it is also providing information about the supply chain — the quality of parts and products being used, where they came from, and how they were grown, bought, or created. More and more, customers are demanding that the things they buy are manufactured responsibly. And manufacturers — not just the brands selling the products being manufactured — are being held accountable for those details thanks to the IoT.
 
Research from the MPI Group has found that approximately 70% of manufacturers credit the IoT with elevating their profitability. Research also suggests manufacturing companies will invest some $267 billion by 2020. Clearly, they are starting to get the message that the technology can provide incredible value for them. Another noteworthy data point is that 90% of manufacturing companies in the United States today have fewer than 500 employees, according to the National Association of Manufacturers. Will they have the capacity to invest in the IoT, and to support employees knowledgeable about it? It is questionable. And it may be the one thing that causes small manufacturers to drop out of the digital transformation game altogether.
 
Predictive Everything
 
Research suggests a single hour of downtime can equate to $100,000 in losses in a manufacturing environment. Using data, AI, and predictive analytics, some say manufacturers can reduce planned outages by 50%. IBM says it can even reduce unplanned outages by 15%. Predictive analytics helps companies better understand how their machines work, and why they fail, which allows them to prevent those failures altogether. Going forward, this is not just a nice-to-have but a must-have for manufacturing environments.
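 
The arithmetic behind those percentages is straightforward. The sketch below plugs the article's $100,000-per-hour figure and the quoted 50% and 15% reductions into hypothetical baseline outage hours for a single plant; the baseline hours are assumptions, not figures from the article.
 
# Illustrative downtime-savings arithmetic only.
cost_per_hour = 100_000      # dollars lost per hour of downtime (from the article)
planned_hours = 200          # assumed annual planned-outage hours (hypothetical)
unplanned_hours = 80         # assumed annual unplanned-outage hours (hypothetical)

savings = (planned_hours * 0.50 + unplanned_hours * 0.15) * cost_per_hour
print(f"estimated annual savings: ${savings:,.0f}")   # $11,200,000 under these assumptions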
 
Certainly, manufacturers today are operating in an environment that is full of risks and unknowns — how will the market change? How will it be disrupted? Where will their business take them, geographically? Will they be able to find partners in those areas that share the same level of commitment to quality as they do? With so many global variables at hand, predictive analytics can help manufacturers make better, smarter, faster, and less risky decisions about everything from machine maintenance to supply chain optimization, all of which impacts the customer experience, from the quality of goods produced to when customers receive their orders.
 
5G
 
Yes, we are finally hitting an age when 5G will play a role in reducing latency, providing high bandwidth, and allowing quality real-time communication on an extensive scale. With 5G, manufacturers can begin to improve their use of sensors, cloud, centralized tracking, quality inspection, and more, forming an “ecosystem” of smart manufacturing. Admittedly, we may see a growing disparity between 5G haves and have-nots in 2020 (much like with the IoT). But it will certainly play a bigger role in smart manufacturing moving forward.
 
The big trends in the digital transformation of manufacturing are likely to be rounded out by technologies like 3-D printing, which will continue to allow companies to make faster, cheaper prototypes, while AR and VR will continue to allow for better, safer training across the board. These are not really new trends, but rather areas of continuous improvement for manufacturing.
 
It is essential to also reiterate that the growing connectedness to consumer demands will play a far more significant role in changing manufacturing for the better in the coming year than those technologies themselves. Today, every company is there to serve the customer, no matter how far from the customer it may have operated in the past. Transformational trends such as the (A)IoT and 5G will force manufacturers to do that even more in the coming decade. They will also make that level of accountability possible.
 

The 7 Biggest Technology Trends In 2020 Everyone Must Get Ready For Now

Oct 2, 2019
We are in the middle of the 4th Industrial Revolution, and technology is evolving faster than ever. Companies and individuals that don't keep up with the major tech trends run the risk of being left behind. Understanding the key trends will allow people and businesses to prepare for and grasp the opportunities. This article covers the seven most imminent trends everyone should be prepared for in 2020.
 
AI-as-a-service
 
Artificial Intelligence (AI) is among the most transformative tech evolutions of our times. As Bernard Marr highlights in his book 'Artificial Intelligence in Practice', most companies have begun to explore how they can use AI to improve the customer experience and to streamline their business operations. This will continue in 2020, and while people will increasingly become used to working alongside AIs, designing and deploying our own AI-based systems will remain an expensive proposition for most businesses.
 
For that reason, much of AI will continue to be delivered through providers of as-a-service platforms, which allow us to simply feed in our own data and pay for the algorithms or compute resources as we use them.
 
At present, these platforms, provided by the likes of Amazon, Google, and Microsoft, are usually somewhat broad in scope, with (often expensive) custom-engineering required to apply them to the specific tasks an organization may require. During 2020, we will see wider adoption and an ever-increasing pool of providers that are most likely to start offering more tailored applications and services for specific or specialized tasks. This will mean no company will have any excuses left not to use AI.
 
5G data networks
 
The 5th generation of mobile internet connectivity is going to give us super-fast download and upload speeds as well as more stable connections. While 5G mobile data networks became available for the first time in 2019, they were still largely expensive and limited to confined areas or major cities. 2020 is likely to be the year when 5G really starts to fly, with more affordable data plans as well as greatly improved coverage, meaning that everyone can join in the fun.
 
Super-fast data networks will not simply give us the ability to stream movies and music at higher quality when we are on the move. The drastically increased speeds mean that mobile networks will become more usable even than the wired networks running into our homes and businesses. Companies must consider the business implications of having super-fast and stable internet access anywhere. The enhanced bandwidth will enable machines, robots, and autonomous vehicles to collect and transfer more data than ever, leading to advances in the area of the Internet of Things (IoT) and smart machinery.
 
Autonomous Driving
 
While we are still not at the stage where we can expect to routinely travel in, or even see, autonomous vehicles in 2020, they will undoubtedly continue to generate a significant amount of excitement.
 
Tesla chief Elon Musk has said he expects his company to create a truly “complete” autonomous vehicle by this year, and the number of vehicles capable of operating with a lesser degree of autonomy – such as automated braking and lane-changing – will become an increasingly common sight. On top of this, other in-car systems not directly connected to driving, such as security and entertainment functions, will become increasingly automated and reliant on data capture and analytics. Google's sister company Waymo has just completed a trial of autonomous taxis in California, where it transported more than 6,200 people in the first month.
 
It will not just be cars, of course – trucking and shipping are becoming more autonomous, and breakthroughs in this space are likely to continue to hit the headlines throughout 2020.
 
Along with the maturing of autonomous driving technology, we will also increasingly hear about the measures that will be taken by regulators, legislators, and authorities. Changes to laws, existing infrastructure, and social attitudes are all going to be required before autonomous driving becomes a practical reality for most of us. During 2020, it is likely we will start to see the debate around autonomous driving spread outside the tech world, as an increasing number of people come round to the idea that the question is not ''if,'' but ''when,'' it will become a reality.
 
Personalized and Predictive Medicine
 
Technology is currently transforming healthcare at an unprecedented rate. Our ability to capture data from wearable devices such as smartwatches will increasingly give us the ability to predict and treat health issues in people even before they experience any symptoms.
 
When it comes to medication, we will see far more personalized approaches. This is also referred to as precision medicine, which allows doctors to more accurately prescribe medicines and apply treatments, thanks to a data-driven understanding of how effective they are likely to be for an individual patient.
 
Although not a new idea, recent breakthroughs in technology, particularly in the fields of genomics and AI, are giving us a greater understanding of how different people’s bodies are better or worse equipped to fight off certain diseases, as well as how they are likely to react to different types of medication or treatment. Throughout 2020 we will see new applications of predictive healthcare and the introduction of more personalized and effective treatments, ensuring better outcomes for individual patients.
 
Computer Vision
 
In computer terms, “vision” involves systems that are able to identify items, places, objects or people from visual images collected by a camera or sensor. It is this technology that allows your smartphone camera to recognize which part of the image it is recording is a face, and powers technology such as Google Image Search.
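As a rough, hedged illustration of what such systems do, the short Python sketch below uses OpenCV's bundled Haar-cascade detector to find faces in a photo and draw boxes around them. The file names are hypothetical, and real smartphone cameras use far more sophisticated neural-network detectors; this is only a minimal sketch of the general idea.

    import cv2  # OpenCV; install with: pip install opencv-python

    # Load the pre-trained frontal-face cascade that ships with OpenCV.
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    image = cv2.imread("photo.jpg")  # hypothetical input file
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Returns one (x, y, width, height) box per detected face.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    for (x, y, w, h) in faces:
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imwrite("photo_with_faces.jpg", image)  # hypothetical output file
    print(f"Detected {len(faces)} face(s)")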
 
As we move through 2020, we will see computer vision tools and technology rolled out for an ever-increasing number of uses. Computer vision is fundamental to the way autonomous cars will “see” and navigate their way around danger. Production lines will employ computer vision cameras to watch for defective products or equipment failures, and security cameras will be able to alert us to anything out of the ordinary, without requiring 24/7 monitoring.
 
Computer vision is also enabling face recognition, which we will hear a lot about in 2020. We have already seen how useful the technology is in controlling access to our smartphones, in the case of Apple's FaceID, and how Dubai airport uses it to provide a smoother customer journey. However, as use cases grow in 2020, we will also have more debates about limiting the use of this technology because of its potential to erode privacy and enable 'Big Brother'-like state control.
 
Extended Reality
 
Extended Reality (XR) is a catch-all term that covers several new and emerging technologies being used to create more immersive digital experiences. More specifically, it refers to virtual, augmented, and mixed reality. Virtual reality (VR) provides a fully immersive digital experience where you enter a computer-generated world using headsets that block out the real world. Augmented reality (AR) overlays digital objects onto the real world via smartphone screens or displays (think Snapchat filters). Mixed reality (MR) is an extension of AR, meaning users can interact with digital objects placed in the real world (think playing a holographic piano that you have placed into your room via an AR headset).
 
These technologies have been around for a number of years now but have largely been confined to the world of entertainment, with Oculus Rift and Vive headsets providing the current state of the art in videogames, and smartphone features such as camera filters and Pokemon Go-style games providing the most visible examples of AR.
 
From 2020, expect all of that to change as businesses get to grips with the wealth of exciting possibilities offered by current forms of XR. Virtual and augmented reality will become increasingly prevalent for training and simulation, as well as offering new ways to interact with customers.
 
Blockchain Technology
 
''Blockchain is a technology trend that I have covered substantially this year,'' said Bernard Marr, ''and yet you are still likely to get blank looks if you mention it in non-tech-savvy company.'' 2020 could finally be the year when that changes, though. Blockchain is essentially a digital ledger used to record transactions, secured by its encrypted and decentralized nature. During 2019 some commentators began to argue that the technology was over-hyped and perhaps not as useful as first thought. Nevertheless, continued investment by the likes of FedEx, IBM, Walmart and Mastercard during 2019 is likely to start showing real-world results, and if they manage to prove its value, that could quickly lead to an increase in adoption by smaller players.
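For readers still getting those blank looks, the minimal Python sketch below shows the core idea of a chained ledger: each block stores the hash of the previous block, so tampering with any earlier entry invalidates every later one. It is an illustration only, and leaves out the consensus mechanisms, digital signatures, and peer-to-peer distribution that real blockchains rely on.

    import hashlib
    import json
    import time

    def make_block(transactions, previous_hash):
        # Bundle transactions with a timestamp and the previous block's hash,
        # then hash the whole bundle; altering any earlier block changes its
        # hash and breaks the link to every block that follows.
        block = {
            "timestamp": time.time(),
            "transactions": transactions,
            "previous_hash": previous_hash,
        }
        block["hash"] = hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()
        ).hexdigest()
        return block

    genesis = make_block(["genesis"], previous_hash="0" * 64)
    block_1 = make_block(["Alice pays Bob 5"], previous_hash=genesis["hash"])
    block_2 = make_block(["Bob pays Carol 2"], previous_hash=block_1["hash"])

    # Verify the chain: each block must reference the hash of its predecessor.
    chain = [genesis, block_1, block_2]
    for prev, curr in zip(chain, chain[1:]):
        assert curr["previous_hash"] == prev["hash"]
    print("Ledger is internally consistent")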
 

PayPal To Enter China Payments Through Local Acquisition

Oct 2, 2019
U.S. digital money transfer platform PayPal Holdings Inc has received Beijing's approval to buy a controlling stake in a domestic payments firm, which would make PayPal the first foreign firm to enter China's payment services market.
 
Gopay Information Technology, PayPal's acquisition target, has received approval from China's central bank to sell a 70% stake to PayPal, both companies said on Monday.
 
Gopay has licenses for mobile, online and cross-border yuan payment services, the Chinese company said in its statement. PayPal will make the purchase through a subsidiary in Shanghai. No financial terms were disclosed.
 
The transaction is expected to close in the fourth quarter of 2019 and is subject to customary closing conditions, PayPal said. Early last year, China's central bank announced that it was opening the country's domestic market to foreign third-party electronic payment firms, a move expected to promote competition in the retail payments industry.
 

Could Airships Rise Again?

Oct 2, 2019
Transportation produces about one-fourth of global anthropogenic carbon emissions. Of this, maritime shipping accounts for 3 percent, a share predicted to increase over the next three decades even though the shipping industry is actively searching for greener alternatives and developing near-zero-emission vessels.
 
Researchers with the International Institute for Applied Systems Analysis (IIASA), in Austria, recently explored another potential solution: the return of airships to the skies. Airships would rely on jet stream winds to propel them toward their destinations, and they offer clear advantages over cargo ships in terms of both efficiency and avoided emissions. Returning to airships, says Julian Hunt, a researcher at the IIASA and lead author of the new study, could “ultimately [increase] the feasibility of a 100 percent renewable world.”
 
World leaders are currently meeting in New York for the U.N. Climate Action Summit to present plans to address climate change. Already, average land and sea surface temperatures have risen to almost 1 degree Celsius above pre-industrial levels. If the current rate of emissions continues unchecked, the Intergovernmental Panel on Climate Change estimates that by 2052, temperatures could rise by up to 2 degrees C. At that point, as much as 30 percent of Earth’s flora and fauna could disappear, wheat production could fall by 16 percent, and water would become scarcer.
 
As outlined by Hunt and his collaborators, airships could play a role in cutting future anthropogenic emissions from the shipping sector. Jet streams flow in a westerly direction with an average wind speed of 165 kilometers per hour. Riding these winds, a lighter-than-air vessel could travel around the world in about two weeks (whereas a ship would take 60 days) and require just 4 percent of the fuel consumed by the ship, Hunt says.
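As a rough sanity check of those figures, the short Python calculation below divides an assumed equatorial circumference of about 40,075 km by the quoted 165 km/h average wind speed. At full wind speed the trip would take roughly ten days, so the two-week estimate implies a vessel averaging somewhat less than the wind itself, which seems plausible.

    # Back-of-envelope check of the circumnavigation time (assumed figures noted).
    EARTH_CIRCUMFERENCE_KM = 40_075   # assumption: equatorial circumference
    JET_STREAM_SPEED_KMH = 165        # average wind speed cited in the study

    days_at_full_wind_speed = EARTH_CIRCUMFERENCE_KM / JET_STREAM_SPEED_KMH / 24
    print(f"At full wind speed: {days_at_full_wind_speed:.1f} days")

    # The quoted two-week trip implies a lower average ground speed for the vessel.
    implied_speed_kmh = EARTH_CIRCUMFERENCE_KM / (14 * 24)
    print(f"Implied average speed for a 14-day trip: {implied_speed_kmh:.0f} km/h")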
 
The IIASA-led study, conducted by a team from Austria, Brazil, Germany, and Malaysia, also proposes transporting hydrogen alongside cargo on airships that use hydrogen as the lifting gas. “An aspect that will substantially increase the viability of airships and balloons is the development of a hydrogen economy,” Hunt says. Given that such airships are a clean technology (the only by-product of hydrogen-based fuels is water), using them to also transport hydrogen could offer a further advantage of this transportation mode.
 
Nonetheless, many people still associate airships with the tragic Hindenburg disaster in 1937, when a hydrogen explosion resulted in 36 deaths and earned airships a reputation as quite unsafe. The hydrogen-filled airships mentioned in the IIASA study do have a potential risk of explosion, and aren't ideal for passenger travel.
 
But Hunt says that with new materials such as graphene, which are lighter, more durable, and more flame resistant, the risks are somewhat lower. He also suggests taking extra safety precautions, including flying routes that avoid heavily populated areas. And, says Hunt, if problems do occur while an airship is operating, “the airship should be evacuated and blown up in the stratosphere,” and it should also be equipped with a mechanism to safely unload any other cargo (with parachutes, for instance).
 
Most modern airships, though, use helium rather than hydrogen as the lifting gas, including those developed by the company Flying Whales, which aims to start delivering cargo to remote areas in 2023, and the hybrid airships built by Lockheed Martin for Straightline Aviation.
 
“Helium is an inert gas,” says Mike Kendrick, CEO of Straightline Aviation, which is expecting its helium-based hybrid cargo airships to lift off in early 2021. “[It’s] not quite as efficient as hydrogen [and more expensive] but much safer.” (The hybrid vessels made by Lockheed Martin, unlike conventional airships, are heavier than air.)
 
The most energy-intensive part of airship operations comes when the lifting gas has to be pressurized to allow for descent. If the energy released during depressurization (during lift) can be stored and reused during descent, every trip could theoretically have net zero energy use. And any hydrogen fuel onboard could be supplemented by solar power arrays on the airship’s surface, says Hunt.
 
“If you try to estimate the cost [of using airships for cargo] now, it would be 10 to 50 times more expensive than ships, [which] are a mature technology that has been developing for hundreds of years,” Hunt says. For airships to be competitive with conventional shipping, he adds, the cargo industry would have to invest at least US $50 to $100 billion over the next decade or two in developing the technologies required to make them a safe and practical option for transportation.
 

The Key To Successful AI: Hiding Its Use From People

Sep 30, 2019
AI is proving itself superior to human intelligence in a growing number of fields. That is, except when people know AI is being used.
 
Yes, it seems that in certain human-centric sectors, the performance of artificial intelligence starts to drop off when people are aware that an intelligent machine is involved. In fact, human resistance appears to be the Achilles' heel of artificial intelligence: for all the recent advances in AI technology, this resistance is preventing AI from doing its job in areas where human contact and interaction usually play a central role.
 
This message was brought home recently by a study published in Marketing Science on September 20, titled ''The Impact of Artificial Intelligence Chatbot Disclosure on Customer Purchases.'' In it, an international team of researchers found that chatbots used by an unnamed financial services company were as effective at selling products as proficient sales employees, and four times more effective than inexperienced workers. The catch is that when customers were informed before a conversation that they would be speaking to a chatbot, the ability of the AI-based assistants to encourage customer purchases fell by a massive 79.7%.
 
''Our findings show when people don't know about the use of artificial intelligence (AI) chatbots they are four times more effective at selling products than inexperienced workers, but when customers know the conversational partner is not a human, they are curt and purchase less because they think the bot is less knowledgeable and less empathetic,'' said co-author Xueming Luo, a professor in marketing at Temple University.
 
Similar results have emerged from other studies. Earlier this month, researchers from NYU and Boston University found that patients were less receptive to AI-based healthcare provision and were less likely to trust artificially intelligent healthcare services than human providers. This perhaps is not surprising, but it also is not particularly rational, since AI has been shown in experiments to be at least on par with human experts in, for example, diagnosing medical conditions from images.
 
Moreover, in the context of money, research published by enterprise software firm VMware in early 2019 concluded that only 19% of people in the U.K. would be happy providing AI with a role in managing their finances. This is despite the fact that there's already evidence of the superiority of AI-based investment, with data from Eurekahedge revealing that the annualized rate of return for 14 AI-driven hedge funds is 12.74% (as of August 2019), whereas the average rate for the large hedge funds of its ''Eurekahedge 50'' is only 5.12%. And more broadly, the average annual rate of return for AI funds over the decade from 2010 to 2019 was 13.18%, while the ten-year average for the Eurekahedge 50 was only 5.4% (and the ten-year average for Eurekahedge's main index of 2,500 funds was 4.9%).
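To put those percentages in perspective, the short Python sketch below compounds a hypothetical $1 million over the same ten-year span at the rates quoted above. The starting sum is arbitrary and the calculation ignores fees, taxes and volatility; it simply shows how a gap of a few percentage points per year widens dramatically once compounded over a decade.

    # Compound the cited annualized returns over ten years (illustrative only).
    rates = {
        "AI-driven funds": 0.1318,        # average annual return, 2010-2019
        "Eurekahedge 50": 0.054,          # ten-year average
        "Eurekahedge main index": 0.049,  # ten-year average, ~2,500 funds
    }

    principal = 1_000_000  # hypothetical starting sum

    for name, rate in rates.items():
        final_value = principal * (1 + rate) ** 10
        print(f"{name:>24}: ${final_value:,.0f}")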
 
In other words, our in-built prejudice against AI is preventing the technology from being used and applied more widely, and from reaching its full potential. Meanwhile, the Marketing Science study suggests two ways out of this predicament: either hiding the use of AI from people entirely, or working to build trust gradually via incremental exposure to AI.
 
In some respects, firms engaged in AI are probably already hiding, or at least underplaying, their use of the technology. There is evidence that people are interacting with artificial intelligences without even knowing it, as indicated by a 2017 Pega survey which revealed that, while only 33% of people believe they use AI technology, around 77% actually do. Likewise, a 2018 BarclayHedge survey indicated that around 56% of hedge funds rely on artificial intelligence to some degree to inform their investment decisions, even though the vast majority of people are still wary about trusting AI with their finances.
 
And yet, while it may end up being effective to quietly ''force'' AI on the public, it's not likely to be a viable strategy in the long term. In July, the State of California passed a law requiring companies to ensure that chatbots disclose themselves to consumers, paving the way for similar rules to be passed in other states and possibly at a federal level.
 
As a result, the tactic of quietly slipping chatbots and AI into wider society already looks untenable. Instead, the AI and tech industry will have to settle for building trust in artificial intelligence over time: by conducting more studies demonstrating AI's efficacy, by gradually rolling out AI-based services in a piecemeal fashion, and by striving constantly to make AI more explainable to the general public. Only by doing this will society come to permit a more central role for artificial intelligence, and will AI have a positive effect on us rather than an uncertain and potentially negative one.
 

Russia Rolls Out Red Carpet For Huawei Over 5G

Sep 30, 2019
While the US has banned Huawei over alleged espionage and asked its allies to do the same, Moscow has rolled out the red carpet for the Chinese tech company, allowing it to develop 5G networks in Russia. Analysts say the move is as much a show of solidarity with Beijing against the US as it is a drive to bring ultra-high-speed internet to Russian tech users.
 
This month, Huawei set up its first 5G test zone in Moscow in partnership with Russian operator MTS, with a view to rolling out the service to the rest of the capital. Moscow authorities say the network will become part of the city's standard infrastructure within the next few years. A pioneer in telecoms networks compared with some Western countries, Russia plans to deploy 5G in all of its main cities by 2024.
 
When Chinese President Xi Jinping visited Russia in June, at the height of Washington's conflict with Huawei, Russia's main operator MTS signed a contract with the Chinese company. At the inauguration of the 5G zone in Moscow, the head of Huawei's Russian branch, Zhao Lei, highlighted the company's activities in the country.
 
''We have been working in Russia for 22 years. Thanks to our partners, we live well here,'' he said. He added that Huawei, considered a world leader in 5G technology, plans to ''lead in the development of 6G'' in the future.
 
Huawei is also the world's second-largest smartphone company. A source in Russia's 5G research community explained that Huawei is the largest investor in the development of mobile technologies in Russia, with ''the largest research laboratory of all operators'' in Moscow.
 
According to the Vedomosti business daily, Huawei currently employs 400 people in Moscow and 150 in Saint Petersburg in mobile research and development. It aims to hire 500 more people by the end of 2019 and 1,000 more over five years. Experts say Russia's welcome of Huawei does not mean the Chinese company is alone in the race to develop 5G in Russia.
 
''Russian operators are all collaborating with multiple 5G equipment vendors, Huawei included. We do not see any clear 5G leaders in the network deployment in Russia,'' said Michela Landoni, an analyst at Fitch Solutions.
 
She said operators prefer this approach to avoid being ''reliant on one specific vendor'' and to protect themselves against cyber threats. The Tele2 operator was the first to launch 5G in Russia with Sweden's Ericsson in August, on Moscow's main Tverskaya street.
 
In the midst of a trade war and technological rivalry with China, the US has moved to cut Huawei's access to the US components and services it needs, such as the Android operating system that the company uses on its phones. Russia then promptly stepped in to offer its Aurora operating system to the Chinese group.
 
While Android remains Huawei's preferred choice, Ms Landoni said Aurora could be a ''short-term solution'' for the group. According to the analyst, Aurora could become a ''stepping stone'' in the development of Huawei's own OS.
 
According to Sylvain Chevallier, a partner at the technology consulting firm BearingPoint, the goal is ''to create an economic front against the US''. Russia and China, he said, are trying to break away from the US monopoly over smartphone operating systems.
 

Google CEO Talks Huawei, Regulations and Company Size

Sep 30, 2019
In spite of its commanding presence as a technology titan, Google faces no shortage of challenges. U.S. President Donald Trump has publicly questioned its work with China. There are increasing calls to better protect the large amount of personal user data Google handles, and authorities in the U.S. have launched an antitrust investigation into some of its activities.
 
''We are not building a search project for China. I think we have been very clear on that for a while now,'' CEO Sundar Pichai told Nikkei in an interview. Google's earlier plans to re-enter mainland China with a censored version of its search engine drew fierce criticism from the Trump administration and Congress.
 
''I think the U.S.-China bilateral trade conversations are really important,'' Pichai said, expressing his hope that the two sides ''get it right in a way that works for their citizens.''
 
Regarding the company's business relationship with Huawei Technologies, Pichai said that Android is an open-source platform and remains available to the Chinese smartphone maker. ''We do everything consistent with the law, to help sustain that ecosystem.''
 
As for Huawei developing its own mobile operating system after the U.S. government banned American companies from exporting items to the company, Pichai said, ''Huawei is a very successful company, and I think they will have initiators.''
 
''I fully expect Huawei to serve its users, and I think we all have to adjust to the realities of the trade situation and work accordingly,'' he said.
 
A new challenge facing Google is the patchwork of regulatory regimes emerging in several countries. A self-described ''tech optimist,'' Pichai stressed the importance of balancing regulation and innovation. ''There is some inherent tension between countries, rightfully, being worried about safeguarding its own citizens and better protecting them,'' he said. ''And helping balance that against a connective, open, and free internet'' that creates new opportunities.
 
''Depending on how early your technology is, you want to make sure you can innovate, because technology can also solve important problems.''
 
One area where regulations are beginning to take shape is data privacy. Privacy rules give ''a clear framework for what users can expect, what businesses need to comply with, and I think it's good to have standards and frameworks for technology,'' Pichai said.
 
Such regimes can become a burden for Google, which derives more than 80% of its revenue from digital advertising, but the company believes that privacy regulation is inevitable. Pichai pointed to the European Union's General Data Protection Regulation as an example of a framework that gives users a clear idea of how their privacy is protected, and companies an idea of how to build products. ''Hopefully it's a template for the rest of the world,'' he said, suggesting that a unified, global standard would be ideal.
 
Problems arise when different countries and regions create their own frameworks, creating a patchwork of regulations. Japan, the U.S., and India are all considering the issue, and California is set to enact statewide regulations next year. Pichai pushed back against the charge that Google has become too big. ''At a high level, stepping back, we're a company at scale. The internet works at scale.''
 
''We have hundreds of researchers who work on AI in health care, to help better detect and treat diseases, as an example,'' he said, also pointing to the company's work on AI and cyber security. ''The scale at which you need to do these things involves tremendous R&D, and the need to do it at scale.''
 
In answer to criticism over the company's aggressive merger and acquisition strategy, Pichai stressed that Google was not seeking to monopolize any market and that it still competes with other companies in several areas. ''We are behind some other companies in cloud,'' he said as an example.
 
Authorities have been cracking down on tech giants, with about 40 U.S. states launching antitrust probes into Google and Facebook this month. The company will most likely remain at odds with governments over its scale and activities for some time.
 
As to artificial intelligence, Pichai is more cautious when it comes to regulation, at least for now. The field ''is very much in its infancy, and we also are approaching a lot of our AI work in an open way,'' he said. Google's position is that regulating current forms of AI, which can only handle certain tasks like image recognition or translation, will not eliminate bias and other issues with the technology.
 
But ''I expect robust AI regulation to be there'' over time as the field progresses toward general-purpose AI, Pichai said. He cited Google's AI principles published last year, and said each company needs to think about its ethical standards for the technology.
 

Baidu Wins China's First Commercial License for Self-driving Buses

Sep 30, 2019
Chinese search engine giant Baidu is among three companies to secure a license from the city of Wuhan to operate a commercial transportation service using self-driving vehicles, in a first for China. Authorities hail the move as the start of the world's first 5G-based driverless commercial service.
 
Also granted licenses were Shanghai-based DeepBlue Technology, which operates self-driving buses on a trial basis in Tianjin and other places, and Shenzhen Haylion Technologies, an autonomous-driving technology development unit of a state-owned bus company. Licenses for self-driving transportation have been issued in China before, but only for trial services. The country is emerging as a hotbed of development for autonomous vehicles, which rely on massive amounts of data to learn the rules of the road.
 
The three companies are allowed to provide transportation on a total of 28 km of public roads in Wuhan, the capital of Hubei Province in central China. They are expected to use mainly buses. Suppliers of the necessary infrastructure, such as fifth-generation wireless networks, include state-run China Mobile and telecom equipment giant Huawei Technologies.
 
Wuhan's endeavors in 5G autonomous-driving technology will serve as a model for other cities, a Huawei executive was quoted as saying in local reports. A China Mobile official expressed confidence in the service, which uses China's BeiDou satellite positioning system.
 
''The combination of 5G and BeiDou will keep any delay to a matter of thousandths of a second, and allow for centimeter-level positioning accuracy,'' the official said.
 
Wuhan, home to state-owned Dongfeng Motor, aims to expand the public roads available for the new services to 159 km, covering a 90 sq. km area. Baidu last year announced the commercial launch of what it called the world's first mass-produced self-driving bus.
 

Samsung Reveals New Image Sensor

Sep 27, 2019
In a world where “bigger” pixels and “deeper” pixels dominate discussions around smartphone sensors, Samsung has decided to go… another way. The image sensor giant has unveiled the world’s first 0.7µm-pixel mobile image sensor, which allowed it to pack 43.7 megapixels into a sensor that is less than 5mm wide. The Samsung ISOCELL Slim GH1 is incredibly tiny.
 
To combine this tiny size with enough resolution to appeal to the spec hounds of 2019, Samsung took advantage of its Tetracell technology, which makes the Slim GH1 two sensors in one by using a quad-Bayer filter to group squares of four pixels together.
 
In good light, a “remosaicing” algorithm allows this image sensor to shoot full 43.7MP images; when the light drops and those tiny 0.7µm pixels would cause significant problems with noise, the quad-Bayer design enables the sensor to produce better 10.9MP images with “higher light sensitivity the same as that of a 1.4µm-pixel image sensor.” Best of both worlds.
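A minimal Python/NumPy sketch of the underlying idea appears below: 2x2 pixel binning averages each square of four small pixels into one larger effective pixel, which is why the 43.7MP readout drops to roughly a quarter of that, about 10.9MP, in low light. This is only an illustration of the general technique, not Samsung's actual processing pipeline.

    import numpy as np

    def bin_2x2(raw):
        # Average each 2x2 block of a (height, width) sensor readout into one
        # pixel, quartering the resolution but boosting per-pixel signal.
        h, w = raw.shape
        assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
        return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

    # Toy example: an 8x8 "full resolution" readout becomes a 4x4 binned readout.
    full_res = np.random.poisson(lam=5, size=(8, 8)).astype(float)
    binned = bin_2x2(full_res)
    print(full_res.shape, "->", binned.shape)  # (8, 8) -> (4, 4)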
 
And since this scaled-down Tetracell resolution is still plenty to cover 4K, the ISOCELL Slim GH1 promises “more detailed backgrounds when recording high-resolution videos or selfies at 60 frames per second (fps).”
 
This is an especially impressive achievement for Samsung, which has been making major strides in the smartphone image sensor game with its ultra-high-resolution 64MP and 108MP ISOCELL Bright image sensors that take advantage of the same Tetracell technology. For the ISOCELL Slim, Samsung simply flipped the script, using the same technology to pack more resolution and performance into a smaller chip.
 
The hope is that this sensor “will enable sleeker and more streamlined designs as well as excellent imaging experiences in tomorrow’s smartphones.” Oh, and if you are wondering whether this tech will soon make it into “real” cameras, the answer seems to be yes: Sony has already developed a quad-Bayer version of the 61MP full-frame sensor found in the Sony a7R IV, which could bring the same “best of both worlds” approach to much larger, more capable cameras.
 
