Google's Diversity Chief: Mamas Don't Let Their Baby Girls Grow Up To Be Coders 405

Posted by samzenpus
from the starts-at-home dept.
theodp writes: Explaining the reasons for its less-than-diverse tech workforce, Google fingered bad parenting for its lack of women techies. From the interview with Google Director of Diversity and Inclusion Nancy Lee: "Q. What explains the drop [since 1984] in women studying computer science? A. We commissioned original research that revealed it's primarily parents' encouragement, and perception and access. Parents don't see their young girls as wanting to pursue computer science and don't steer them in that direction. There's this perception that coding and computer science is ... a 'brogrammer' culture for boys, for games, for competition. There hasn't been enough emphasis on the power computing has in achieving social impact. That's what girls are interested in. They want to do things that matter." While the study is scant on details, its charts appear to show that, overall, fathers encourage young women to study CS more than mothers do. Google feels that reeducation is necessary. "Outreach programs," advises Google, "should include a parent education component, so that parents learn how to actively encourage their daughters."

GM's Exec. Chief Engineer For Electric Vehicles Pam Fletcher Answers Your Questions 107

Posted by samzenpus
from the read-all-about-it dept.
Pam Fletcher was propulsion system chief engineer on the first Chevrolet Volt plug-in hybrid and is now executive chief engineer for electrified vehicles at GM, overseeing electrified vehicles company-wide. A while ago you had a chance to ask about her work and the future of electric cars. Below you'll find her answers to your questions.

The Reason For Java's Staying Power: It's Easy To Read 413

Posted by samzenpus
from the easy-on-the-eyes dept.
jfruh writes: Java made its public debut twenty years ago today, and despite a sometimes bumpy history that features its parent company being absorbed by Oracle, it's still widely used. Mark Reinhold, chief architect of Oracle's Java platform group, offers one explanation for its continuing popularity: it's easy for humans to understand it at a glance. "It is pretty easy to read Java code and figure out what it means. There aren't a lot of obscure gotchas in the language ... Most of the cost of maintaining any body of code over time is in maintenance, not in initial creation."

Hydrogen-Powered Drone Can Fly For 4 Hours at a Time 116

Posted by samzenpus
from the different-way-to-fly dept.
stowie writes: The Hycopter uses its frame to store energy in the form of hydrogen instead of air. With less lift power required, its fuel cell turns the hydrogen in its frame into electricity to power its rotors. The drone can fly for four hours at a time and 2.5 hours when carrying a 2.2-pound payload. “By removing the design silos that typically separate the energy storage component from UAV frame development - we opened up a whole new category in the drone market, in-between battery and combustion engine systems,” says CEO Taras Wankewycz.

Energy Dept. Wants Big Wind Energy Technology In All 50 US States 256

Posted by Soulskill
from the any-way-the-wind-blows dept.
coondoggie writes: Bigger wind turbines and towers are just part of what the U.S. needs in order to more effectively use wind energy in all 50 states. That was the thrust of a wind energy call-to-arms report called "Enabling Wind Power Nationwide" issued this week by the Department of Energy. The report details new technology that can reach higher into the sky to capture more energy, and more powerful turbines to generate more gigawatts. These new turbines are 110-140 meters tall, with blades 60 meters long. The Energy Department forecasts strong, steady growth of wind power across the country, both on land and offshore.
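The case for taller towers and longer blades comes down to basic physics: captured power scales with the rotor's swept area and with the cube of wind speed. A minimal sketch of the standard wind-power formula (the wind speed and power coefficient below are illustrative assumptions, not figures from the report):

```python
import math

def wind_power_watts(blade_length_m, wind_speed_ms, cp=0.4, air_density=1.225):
    """Power captured by a turbine rotor: P = 0.5 * rho * A * v^3 * Cp."""
    swept_area = math.pi * blade_length_m ** 2  # rotor swept area, m^2
    return 0.5 * air_density * swept_area * wind_speed_ms ** 3 * cp

# A 60 m blade (the size cited in the report) in a modest 8 m/s wind:
p_large = wind_power_watts(60, 8)  # roughly 1.4 MW
# A 40 m blade in the same wind captures well under half as much:
p_small = wind_power_watts(40, 8)
```

Because swept area grows with the square of blade length, and wind speeds are higher and steadier aloft (cubing into the output), modest increases in turbine size pay off disproportionately.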

Google Offers Cheap Cloud Computing For Low-Priority Tasks 59

Posted by Soulskill
from the guaranteed-7-7s-of-uptime dept.
jfruh writes: Much of the history of computing products and services involves getting people desperate for better performance and faster results to pay a premium to get what they want. But Google has a new beta service that's going in the other direction — offering cheap cloud computing services for customers who don't mind waiting. Jobs like data analytics, genomics, and simulation and modeling can require lots of computational power, but they can run periodically, can be interrupted, and can even keep going if one or more nodes they're using go offline.

Marvel's Female Superheroes Are Gradually Becoming More Super 228

Posted by Soulskill
from the superduper-heroines dept.
New submitter RhubarbPye writes: A new study shows an increasing trend in the power and significance of female superhero characters in the Marvel comic book universe. Several criteria were used to examine the trend, including cover art, dialog, and the actual superpowers. Over 200 individual comic books from Marvel's 50+ year history were compared for the study. What's of particular interest is that the study's author is a 17-year-old high school student from Ohio.

AMD Details High Bandwidth Memory (HBM) DRAM, Pushes Over 100GB/s Per Stack 98

Posted by timothy
from the lower-power-higher-interest dept.
MojoKid writes: Recently, a few details of AMD's next-generation Radeon 300-series graphics cards have trickled out. Today, AMD publicly disclosed new info regarding the High Bandwidth Memory (HBM) technology that will be used on some Radeon 300-series and APU products. Currently, a relatively large number of GDDR5 chips are necessary to offer sufficient capacity and bandwidth for modern GPUs, which means significant PCB real estate is consumed. On-chip integration is not ideal for DRAM, because DRAM is not size- or cost-effective on a logic-optimized GPU or CPU manufacturing process. HBM, however, brings the DRAM as close as possible to the logic die (GPU). AMD partnered with Hynix and a number of other companies to help define the HBM specification and design a new type of memory chip with low power consumption and an ultra-wide bus width, which was eventually adopted by JEDEC in 2013. They also developed a DRAM interconnect called an "interposer," along with ASE, Amkor, and UMC. The interposer allows DRAM to be brought into close proximity with the GPU and simplifies communication and clocking. HBM DRAM chips are stacked vertically, and "through-silicon vias" (TSVs) and "bumps" are used to connect one DRAM chip to the next, then to a logic interface die, and ultimately to the interposer. The end result is a single package on which the GPU/SoC and High Bandwidth Memory both reside. 1GB of GDDR5 memory (four 256MB chips) requires roughly 672mm2; because HBM is vertically stacked, that same 1GB requires only about 35mm2. The bus width on an HBM chip is 1024 bits, versus 32 bits on a GDDR5 chip. As a result, the HBM interface can be clocked much lower yet still offer more than 100GB/s, versus 25GB/s for GDDR5. HBM also requires significantly less voltage, which equates to lower power consumption.
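The bandwidth gap falls straight out of the bus-width arithmetic. A quick sketch (the per-pin transfer rates below are assumptions consistent with first-generation HBM at roughly 1 GT/s and a common 6 GT/s GDDR5 part, not AMD-published figures):

```python
def bandwidth_gbps(bus_width_bits, transfer_rate_gtps):
    """Peak bandwidth in GB/s: bus width x transfers per second, over 8 bits/byte."""
    return bus_width_bits * transfer_rate_gtps / 8

hbm_per_stack = bandwidth_gbps(1024, 1.0)  # 1024-bit bus at ~1 GT/s -> 128 GB/s
gddr5_per_chip = bandwidth_gbps(32, 6.0)   # 32-bit bus at 6 GT/s    -> 24 GB/s
```

Even clocked far lower per pin, the 32x-wider bus puts a single HBM stack comfortably above the 100GB/s mark cited in the summary.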

New Chips Could Bring Deep Learning Algorithms To Your Smartphone 40

Posted by samzenpus
from the smarter-smart-phone dept.
catchblue22 writes: At the Embedded Vision Summit, a company called Synopsys showed off a new image-processor core tailored for deep learning. It is expected to be added to chips that power smartphones, cameras, and cars. Synopsys showed a demo in which the new design recognized speed-limit signs in footage from a car. The company also presented results from using the chip to run a deep-learning network trained to recognize faces. A spokesperson said that it didn't hit the accuracy levels of the best research results, which have been achieved on powerful computers, but it came pretty close. "For applications like video surveillance it performs very well," he said. Being able to use deep learning on mobile chips will be vital to helping robots navigate and interact with the world, he said, and to efforts to develop autonomous cars.

Wind Turbines With No Blades 164

Posted by Soulskill
from the can-finally-take-them-through-airport-security dept.
An anonymous reader writes: Wired has a profile of Spanish company Vortex Bladeless and their unusual new wind turbine tech. "Their idea is the Vortex, a bladeless wind turbine that looks like a giant rolled joint shooting into the sky. The Vortex has the same goals as conventional wind turbines: To turn breezes into kinetic energy that can be used as electricity." Instead of relying on wind to push a propeller in a circular motion, these turbines rely on vorticity — how wind can strike an object in a particular way to generate spinning vortices of air. Engineers usually try to avoid this — it's what brought down the Tacoma Narrows Bridge. But this Spanish company designed the turbine computationally to have the vortices occur at the same time along its entire height. "In its current prototype, the elongated cone is made from a composite of fiberglass and carbon fiber, which allows the mast to vibrate as much as possible (an increase in mass reduces natural frequency). At the base of the cone are two rings of repelling magnets, which act as a sort of nonelectrical motor. When the cone oscillates one way, the repelling magnets pull it in the other direction, like a slight nudge to boost the mast's movement regardless of wind speed. This kinetic energy is then converted into electricity via an alternator that multiplies the frequency of the mast's oscillation to improve the energy-gathering efficiency."
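The vortex shedding the design exploits follows a textbook relation: shedding frequency is set by the Strouhal number, f = St * U / D. A rough sketch (St of about 0.2 is the standard value for cylindrical bluff bodies; the mast diameter and wind speed are illustrative, not Vortex Bladeless specifications):

```python
def shedding_frequency_hz(wind_speed_ms, diameter_m, strouhal=0.2):
    """Vortex shedding frequency for flow past a bluff body: f = St * U / D."""
    return strouhal * wind_speed_ms / diameter_m

# A 0.3 m-diameter mast in a 10 m/s wind sheds vortices at ~6.7 Hz.
f = shedding_frequency_hz(10, 0.3)
```

This also shows the design challenge: shedding frequency rises with wind speed, so a resonant mast has to stay in sync across conditions, which is what the tapered shape and the magnetic nudging described above are for.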

In-Database R Coming To SQL Server 2016 94

Posted by Soulskill
from the r,-me-hearties dept.
theodp writes: Wondering what kind of things Microsoft might do with its purchase of Revolution Analytics? Over at the Revolutions blog, David Smith announces that in-database R is coming to SQL Server 2016. "With this update," Smith writes, "data scientists will no longer need to extract data from SQL server via ODBC to analyze it with R. Instead, you will be able to take your R code to the data, where it will be run inside a sandbox process within SQL Server itself. This eliminates the time and storage required to move the data, and gives you all the power of R and CRAN packages to apply to your database." It'll no doubt intrigue Data Scientist types, but the devil's in the final details, which Microsoft was still cagey about when it talked-the-not-exactly-glitch-free-talk (starts @57:00) earlier this month at Ignite. So, brush up your R, kids, and you can see how Microsoft walks the in-database-walk when SQL Server 2016 public preview rolls out this summer.

Oculus Rift Hardware Requirements Revealed, Linux and OS X Development Halted 227

Posted by Soulskill
from the sad-penguin dept.
An anonymous reader writes: Oculus has selected the baseline hardware requirements for running their Rift virtual reality headset. To no one's surprise, they're fairly steep: NVIDIA GTX 970 / AMD 290 equivalent or greater, Intel i5-4590 equivalent or greater, and 8GB+ RAM. It will also require at least two USB 3.0 ports and "HDMI 1.3 video output supporting a 297MHz clock via a direct output architecture."

Oculus chief architect Atman Binstock explains: "On the raw rendering costs: a traditional 1080p game at 60Hz requires 124 million shaded pixels per second. In contrast, the Rift runs at 2160×1200 at 90Hz split over dual displays, consuming 233 million pixels per second. At the default eye-target scale, the Rift's rendering requirements go much higher: around 400 million shaded pixels per second. This means that by raw rendering costs alone, a VR game will require approximately 3x the GPU power of 1080p rendering." He also points out that PC graphics can afford a fluctuating frame rate — it doesn't matter too much if it bounces between 30-60fps. The Rift has no such luxury, however.
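Binstock's fill-rate figures check out with simple arithmetic, as this sketch verifies:

```python
def shaded_pixels_per_second(width, height, hz):
    """Raw fill-rate demand: pixels per frame times frames per second."""
    return width * height * hz

pc_1080p = shaded_pixels_per_second(1920, 1080, 60)  # 124,416,000 (~124M)
rift_raw = shaded_pixels_per_second(2160, 1200, 90)  # 233,280,000 (~233M)

# The quoted 400M figure for the default eye-target scale works out to
# roughly 3.2x the 1080p baseline, matching the "approximately 3x" claim.
ratio = 400_000_000 / pc_1080p
```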

The last requirement is more onerous: Windows 7 SP1 or newer. Binstock says their development for OS X and Linux has been "paused" so they can focus on delivering content for Windows. They have no timeline for going back to the less popular platforms.

Baidu's Supercomputer Beats Google At Image Recognition 115

Posted by samzenpus
from the all-the-better-to-see-you-with dept.
catchblue22 writes: Using the ImageNet object classification benchmark, Baidu’s Minwa supercomputer scanned more than 1 million images, taught itself to sort them into about 1,000 categories, and achieved an image identification error rate of just 4.58 percent, beating humans, Microsoft, and Google. Google's system scored 95.2% accuracy and Microsoft's 95.06%, Baidu said. “Our company is now leading the race in computer intelligence,” said Ren Wu, a Baidu scientist working on the project. “I think this is the fastest supercomputer dedicated to deep learning,” he said. “We have great power in our hands—much greater than our competitors.”

Is Agile Development a Failing Concept? 507

Posted by timothy
from the surely-you're-not-all-out-of-buzzwords dept.
Nerval's Lobster writes: Many development teams have embraced Agile as the ideal method for software development, relying on cross-functional teams and adaptive planning to see their product through to the finish line. Agile has its roots in the Agile Manifesto, the product of 17 software developers coming together in 2001 to talk over development methods. And now one of those developers, Andy Hunt, has taken to his blog to argue that Agile has some serious issues. Specifically, Hunt thinks a lot of developers out there simply aren't adaptable and curious enough to enact Agile in its ideal form. 'Agile methods ask practitioners to think, and frankly, that's a hard sell,' Hunt wrote. 'It is far more comfortable to simply follow what rules are given and claim you're 'doing it by the book.'' The blog posting offers a way to power out of the rut, however, and it centers on a method that Hunt refers to as GROWS, or Growing Real-World Oriented Working Systems. In broad strokes, GROWS sounds a lot like Agile in its most fundamental form; presumably Hunt's future postings, which promise to go into more detail, will show how it differs. If Hunt wants the new model to catch on, he may face something of an uphill battle, given Agile's popularity.

Wireless Charging Tech Adopted By Ford, Chrysler, and Toyota Goes Open Source 75

Posted by timothy
from the cautious-optimism dept.
An anonymous reader writes: The in-vehicle wireless charging technology adopted by Ford, Chrysler, Dodge, RAM, and Toyota has been released to the public domain without royalties or licenses. This technology, which you've probably never heard of before, is in 12 vehicles, more than all the other wireless charging standards combined. The open standard web page shows schematics, app notes, and certification information to get companies to make compatible wireless charging products.

Intel NUC5i7RYH Broadwell Mini PC With Iris Pro Graphics Tested 80

Posted by timothy
from the why-pay-for-big-any-more? dept.
MojoKid writes: In addition to ushering in a wave of new notebooks and mobile devices, Intel's Broadwell microarchitecture has also found its way into a plethora of recently introduced small form factor systems like the company's NUC platform. The new NUC5i7RYH is a mini-PC packing a Core i7-5557U Broadwell processor with Iris Pro graphics, which makes it the most powerful NUC released to date. There's a 5th-gen Core i7 CPU inside (dual-core, quad-thread) that can turbo up to 3.4GHz, an Iris Pro 6100 series integrated graphics engine, support for dual-channel memory, M.2 and 2.5" SSDs, 802.11ac and USB 3.0. NUCs are generally barebones systems, so you have to build them up with a drive and memory before they can be used. The NUC5i7RYH is one of the slightly taller NUC systems that can accommodate both M.2 and 9.5mm 2.5" drives, and all NUCs come with a power brick and VESA mount. With a low-power dual-core processor and on-die Iris Pro 6100-series graphics engine, the NUC5i7RYH won't offer the same kind of performance as systems equipped with higher-powered processors or discrete graphics cards, but for everyday computing tasks and casual gaming, it should fit the bill for users who want a low profile, out-of-the-way tiny PC.

Ask Slashdot: After We're Gone, the Last Electrical Device Still Working? 403

Posted by Soulskill
from the all-the-robots-that-killed-us dept.
Leomania writes: After watching a post-apocalyptic Sci-Fi short on YouTube (there are quite a few) and then having our robot vacuum take off and start working the room, I just wondered what would be the last electric/electronic device still functioning if humans were suddenly gone. I don't mean sitting there with no power but would work if the power came back on; rather, something continuously powered, doing the task it was designed for. Are we talking a few years, decades, or far longer?

Enterprise SSDs, Powered Off, Potentially Lose Data In a Week 184

Posted by timothy
from the other-side-of-solid-state's-speed dept.
New submitter Mal-2 writes with a selection from IB Times of special interest to anyone replacing hard disks with solid state drives: The standards body for the microelectronics industry has found that solid state drives (SSDs) can start to lose their data and become corrupted if they are left without power for as little as a week. ... According to a recent presentation (PDF) by Seagate's Alvin Cox, who is also chairman of the Joint Electron Device Engineering Council (JEDEC), the period of time that data will be retained on an SSD is halved for every 5 degrees Celsius (9 degrees Fahrenheit) rise in temperature in the area where the SSD is stored. If you have switched to SSDs for either personal or business use, do you follow the recommendation here that spinning-disk media be used as backup as well?
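The halving-per-5°C rule is a simple exponential decay. A sketch of how retention falls off with storage temperature (the one-year-at-30°C baseline is an illustrative assumption loosely based on JEDEC's client-drive retention requirement, not a figure from the presentation):

```python
def retention_weeks(temp_c, base_weeks=52.0, base_temp_c=30.0, halving_step_c=5.0):
    """Power-off data retention halves for every `halving_step_c` rise above baseline."""
    return base_weeks * 0.5 ** ((temp_c - base_temp_c) / halving_step_c)

# Each +5 C halves retention: 52 weeks at 30 C, 26 at 35 C, 13 at 40 C...
hot_storage = retention_weeks(55.0)  # ~1.6 weeks at 55 C
```

Under these assumed parameters, a powered-off drive stored at 55°C is already down to the "as little as a week" territory the article describes.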

Transformer Explosion Closes Nuclear Plant Unit North of NYC 213

Posted by timothy
from the could-happen-to-anyone dept.
Reuters reports that a transformer failure and related fire have forced the closure of a generating unit of the Indian Point nuclear plant, about 40 miles north of New York City; another generator at the same facility was unaffected. Witnesses reported seeing an explosion, as well as (according to NBC News) a "huge ball of black smoke" when the transformer exploded, which led to the shutdown of the site's Unit 3. The Reuters article says the plant "has long been controversial because of its proximity to the United States' largest city." Indian Point is one of 99 nuclear power plants licensed to operate in the United States, which together generate about 20 percent of U.S. electricity, according to the U.S. Nuclear Regulatory Commission website.

MIT Report Says Current Tech Enables Future Terawatt-Scale Solar Power Systems 176

Posted by timothy
from the more-power-to-you dept.
Lucas123 writes: Even with today's inefficient wafer-based crystalline silicon photovoltaics, terawatt-scale solar power systems are coming down the pike, according to a 356-page report from MIT on the future of solar energy. Solar electricity generation is one of "very few low-carbon energy technologies" with the potential to grow to very large scale, the study states. In fact, solar resources dwarf current and projected future electricity demand. The report, however, also called out a lack of funds for R&D on newer solar technology, such as thin-film wafers that may be able to achieve lower costs in the long run. Even more pressing than the technology are state and federal policies that squelch solar deployment. For example, government subsidies to solar are dwarfed by subsidies to other energy sources, and trade policies have restricted PV module and other commodity product imports in order to aid domestic industry. Additionally, even though PV module and inverter costs are essentially identical in the United States and Germany, total U.S. residential system costs are substantially above those in Germany.