♫MUSIC♫
DENA HEADLEE: Deaf students are often left in the
dark at planetariums. Here's the scene...a deaf person is sitting
in a planetarium looking up at the stars...they can't see an
American Sign Language interpreter's narration. When
they look down to follow what the interpreter is saying, they
can't see what's going on above them. This was the scene seven
years ago when a summer camp for the deaf visited a planetarium
at Brigham Young University where an idea was born. Computer
science engineer Mike Jones and his NSF-funded team at Brigham
Young University developed SignGlasses. A system and
software to project sign language narration onto several
types of glasses--including Google glass. The system moves
the interpreter in front of the users eyes. After testing the
software app on different devices, the team designed their
first planetarium app. Jones hopes this system will make its
way into classrooms and labs to help deaf students become more
engaged in everything from English class to small group
learning with hearing students. On May 20, 2013, this tornado cut
a 17-mile path across Moore, Oklahoma, killing 24 people and
causing approximately 2 billion dollars in damage. NSF-funded
researchers at the University of Oklahoma hope to save lives by
better understanding how debris interacts with
these deadly tornadoes.
ROBERT PALMER: Tornadic debris actually causes
most of the tornado-related deaths. Because debris is
lofted up when the tornado touches the ground and interacts
with buildings, you can imagine the speed at which debris is
flying, and it causes most of the deaths. So understanding how
debris interacts with the tornado is just fundamental
to saving lives.
DENA HEADLEE: To study its radar signatures, Robert Palmer
and his team observed the May 20th tornado using
specially equipped dual-polarization radar that can
capture the unique radar signature produced by debris
lofted in tornadoes.
ROBERT PALMER: ...debris has a particular shape
and orientation. The signature that you
see on the radar is unique. It's different from what you see
from weather like raindrops and hailstones. So we can
actually use that signature to better detect tornadoes. We
might be able to relate it to storm damage.
DENA HEADLEE: This isn't the only tool Palmer has.
OU's new anechoic chamber lets the team test and measure
different types of debris in a controlled environment.
ROBERT PALMER: When we make measurements out in the
field, we can't always visually see what the debris is doing
physically, but in here we can control the debris and then we
can see exactly what the measurements should look like.
Then we can compare those to the real world. So this is
fundamentally important; we couldn't do the research
without this chamber.
DENA HEADLEE: Palmer and his team hope their new
research tools result in more accurate, more useful
information on tornadoes, ultimately saving lives. As more
people use smartphones and tablets to pay bills and make
purchases, password security has become critical. NSF-funded
engineers at Rutgers University discovered that sweeping fingers
in shapes across the screen of a tablet can unlock phones and
grant access to apps more safely than traditional four-digit
PINs, which are easily stolen by prying eyes. Janne Lindqvist and
his team tested this method on 63 participants. Each created a
gesture and recalled it again 10 days later. The gestures were
captured by a recognizer system designed by the team. Their
analysis showed that free-form gestures were more favorable as
passwords. Even the team's savviest computer science and
engineering students were unable to replicate the gesture
password by shoulder surfing. While testing is still in its
preliminary stages, these gestures appear to be less
likely than traditional typed passwords to be stolen. This
isn't your standard Cadillac SRX. It is one of the most
advanced driverless vehicles ever designed, and recently
navigated Washington, D.C.'s 395 Inner Loop without a driver.
NSF-funded Carnegie Mellon University researchers brought
the experimental vehicle to Capitol Hill to demonstrate
the car's capabilities.
PETER ROGOFF: This is the cutting edge and
we need to embrace the cutting edge.
DENA HEADLEE: With over a decade of fundamental research
and development by scientists and engineers at CMU and
elsewhere, the technology is enabling remarkable advances.
CORA MARRETT: Just as the Internet transformed the way
people interact with information, what we call
cyber-physical systems are transforming the way in which
people interact with engineered systems and the environment.
DENA HEADLEE: Harnessing the most advanced cyber-physical
systems, the car is outfitted with sensors, computer vision,
artificial intelligence, control automation, and powerful
computer processing. This Cadillac packs a serious punch.
The system controls steering, speed and braking, and can
detect when obstacles are in the roadway.
BILL SHUSTER: Who would have envisioned...the Founding
Fathers, the First Congress that first came to Washington...who
would have ever thought we would be sitting here talking about
autonomous vehicles? So you know, George Jetson may
be a reality.
DENA HEADLEE: The team hopes their driverless vehicle
will someday decrease injuries and road fatalities. For more
information about these stories, visit us at NSF.gov. This is NSF
Science Now, I'm Dena Headlee.
♫MUSIC♫