Troll Kingdom

Gems from Science

I often try to visualise THE UNIVERSE in my mind and it ends up blowing my brain.

WHY DID THE BIG BANG HAPPEN.
 
I was on Twitter and saw this tweet

Slate @Slate 29m
Insane photo of a mesocyclone over Colorado looks like a UFO--PHOTOS: Mesocyclone: Photo of storm cell wins Nat Geo photo contest. (I put the long link in, because the short link doesn't copy & paste OR SOMETHING FUCK)

Anyway, the article is really cool and the picture on the article is really cool, but I am too lazy to rehost the image and I don't know if Slate lets you hotlink images so :rwmad:

A supercell is a rotating thundercloud; the spinning vortex in the middle is called a mesocyclone. Conditions need to be just so to create one. First you need wind shear, where wind blows faster in one place than another, so one blanket of air flows over another. This sets up a rolling vortex: a horizontally rotating mass of air, like the way a wave breaks when it reaches a beach. An updraft then lifts that vortex, so it spins vertically.

The warmer air in the vortex rises; this is called convection. If there’s a boundary layer of air above it, called a capping layer, it acts like a lid, preventing the vortex air from rising. It builds up power and can suddenly and explosively grow to a huge size.


I found these on tumblr, they're not of the same picture that won the contest in the Slate article but they're good. I also found a new tumblr blog to follow, because I like pictures of clouds and shit.


[four cloud-photo attachments]
 
Scientists just created a real-life Transformer - Vox

This self-assembling, origami-inspired robot is the creation of engineers from Harvard and MIT, and was revealed in a study published today in Science. It transforms when its circuit is turned on, delivering heat to various spots in the robot's joints, which causes the memory plastic to fold, lifting it up from the ground over the course of about four minutes (it's been sped up in the GIF above). Two motors then allow it to scuttle away on four legs at a speed of around 500 feet per hour.

The engineers made the robot with an algorithm that could be used to create a variety of complex 3D objects from a flat material. With it, they say, we could design robots to be shipped out to all sorts of situations in a flat, packable shape, then get up on their own and do useful things — say, collect environmental data, or map a remote comet.

Optimus Prime, alas, still appears to be several years away.

[animated GIF attachment]
 
Ethical trap: robot paralysed by choice of who to save - 14 September 2014 - New Scientist

CAN we teach a robot to be good? Fascinated by the idea, roboticist Alan Winfield of Bristol Robotics Laboratory in the UK built an ethical trap for a robot – and was stunned by the machine's response.

In an experiment, Winfield and his colleagues programmed a robot to prevent other automatons – acting as proxies for humans – from falling into a hole. This is a simplified version of Isaac Asimov's fictional First Law of Robotics – a robot must not allow a human being to come to harm.

At first, the robot was successful in its task. As a human proxy moved towards the hole, the robot rushed in to push it out of the path of danger. But when the team added a second human proxy rolling toward the hole at the same time, the robot was forced to choose. Sometimes, it managed to save one human while letting the other perish; a few times it even managed to save both. But in 14 out of 33 trials, the robot wasted so much time fretting over its decision that both humans fell into the hole. The work was presented on 2 September at the Towards Autonomous Robotic Systems meeting in Birmingham, UK.
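You can get a feel for why the robot dithers with a toy model. This is just a sketch with made-up mechanics (distances, speeds, and the "reach" threshold are all invented for illustration — it is not Winfield's published controller): each tick, the robot chases whichever proxy is currently in more danger, but switching targets throws away its approach progress, so a near-tie between the two proxies can cost it both.

```python
def rescue(d_a, d_b, v_a, v_b, reach=5.0, robot_speed=1.2, dt=1.0):
    """Toy sketch of the rescue dilemma (hypothetical mechanics).

    Two proxies start d_a and d_b units from the hole and drift in
    at speeds v_a and v_b. Each tick the robot targets whichever
    unsaved proxy is closest to falling; retargeting resets its
    approach progress. Returns how many proxies it saves (0-2).
    """
    saved = set()
    target, progress = None, 0.0
    for _ in range(1000):
        # Unsaved proxies keep drifting toward the hole.
        if "a" not in saved:
            d_a -= v_a * dt
        if "b" not in saved:
            d_b -= v_b * dt
        # Which proxies are still in play (not saved, not fallen)?
        danger = {}
        if "a" not in saved and d_a > 0:
            danger["a"] = d_a
        if "b" not in saved and d_b > 0:
            danger["b"] = d_b
        if not danger:
            return len(saved)  # everyone is either safe or gone
        # Chase the proxy in the most danger; switching costs progress.
        new_target = min(danger, key=danger.get)
        if new_target != target:
            target, progress = new_target, 0.0
        progress += robot_speed * dt
        if progress >= reach:  # reached the proxy: push it clear
            saved.add(target)
            target, progress = None, 0.0
    return len(saved)
```

With one proxy clearly in more danger, the robot commits and can save both in sequence; when the two are nearly tied, the mid-course target switch wastes enough time that both fall — roughly the failure mode described above.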

Winfield describes his robot as an "ethical zombie" that has no choice but to behave as it does. Though it may save others according to a programmed code of conduct, it doesn't understand the reasoning behind its actions. Winfield admits he once thought it was not possible for a robot to make ethical choices for itself. Today, he says, "my answer is: I have no idea".

As robots integrate further into our everyday lives, this question will need to be answered. A self-driving car, for example, may one day have to weigh the safety of its passengers against the risk of harming other motorists or pedestrians. It may be very difficult to program robots with rules for such encounters.

But robots designed for military combat may offer the beginning of a solution. Ronald Arkin, a computer scientist at Georgia Institute of Technology in Atlanta, has built a set of algorithms for military robots – dubbed an "ethical governor" – which is meant to help them make smart decisions on the battlefield. He has already tested it in simulated combat, showing that drones with such programming can choose not to shoot, or try to minimise casualties during a battle near an area protected from combat according to the rules of war, like a school or hospital.
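The simplest piece of that idea is just a veto filter. Here's a minimal sketch in that spirit (my own illustrative geometry, not Arkin's actual governor): engagement is suppressed whenever the target falls inside, or too close to, a protected zone.

```python
from math import hypot

def may_engage(target_xy, protected_zones, min_standoff=0.0):
    """Illustrative constraint filter in the spirit of an 'ethical
    governor' (hypothetical, not Arkin's implementation).

    target_xy:        (x, y) position of the proposed target
    protected_zones:  list of (x, y, radius) circles, e.g. around
                      a school or hospital
    min_standoff:     extra buffer distance beyond each zone's radius

    Returns False (veto) if the target is within any zone plus the
    standoff buffer, True otherwise.
    """
    tx, ty = target_xy
    for zx, zy, r in protected_zones:
        if hypot(tx - zx, ty - zy) <= r + min_standoff:
            return False  # rules of war say no: hold fire
    return True
```

A real governor would reason over far more than distance (target identity, proportionality, uncertainty), but a hard veto layer like this captures the "choose not to shoot" behaviour the simulations demonstrated.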

Arkin says that designing military robots to act more ethically may be low-hanging fruit, as these rules are well known. "The laws of war have been thought about for thousands of years and are encoded in treaties." Unlike human fighters, who can be swayed by emotion and break these rules, automatons would not.

"When we're talking about ethics, all of this is largely about robots that are developed to function in pretty prescribed spaces," says Wendell Wallach, author of Moral Machines: Teaching robots right from wrong. Still, he says, experiments like Winfield's hold promise in laying the foundations on which more complex ethical behaviour can be built. "If we can get them to function well in environments when we don't know exactly all the circumstances they'll encounter, that's going to open up vast new applications for their use."

[youtube]jCZDyqcxwlo[/youtube]

STOP PARALYZING THE ROBOTS :rwmad:
 
[video=youtube;sLDSNvQjXe8]http://www.youtube.com/watch?v=sLDSNvQjXe8[/video]

Someone should have told her that "composed" and "comprised" aren't synonyms, though.
 