November 25, 2010

Maf Event Bus

MAF v3 Framework (Part 2)
The cross-platform MAF Event Bus makes use of Qt Signals/Slots to dispatch events between objects, both locally and remotely. The Qt mechanism has been extended to also handle remote communication transparently to EventBus users, decoupling the sender from the receiver and allowing events to be sent remotely through different network connectors, which can be extended via the strategy pattern.
Other information on MAF v3 framework can be found here: MAF Site

November 16, 2010

Interface Design Basics

Interesting presentation on Interface Design:

April 28, 2010

f8 Conference

Today I watched Mark Zuckerberg presenting at the f8 Conference (recorded on 21st April 2010).
It was very interesting to hear how the Internet could evolve with connections between different social networks. We could have a better social experience and retrieve information from the Internet in a way that is tailored just for us.

Enjoy the movie and thanks for reading :)

Watch live streaming video from f8conference at livestream.com

March 3, 2010

Touch screen on skin

Researchers at Microsoft are working on ways to bring touch-screen technology to any kind of surface, in this case the skin.
To achieve this goal, the researchers used a pico projector and a special microphone that detects the pressure of your fingers on the skin. The system is called Skinput.

The projector displays the user interface, while the microphone determines the position of each tap based on the physical principle of wave transmission through the body.
The video below shows how a touch-screen interface can be created using a person's arm as the "screen".




Thanks for reading :)


February 15, 2010

Man Machine integration

In this video, Kevin Warwick presents an interesting point of view on man-machine integration as a way to expand our knowledge and potential.
The human race could become more capable, and see things in different ways and from different perspectives, if it accepted a mix of artificial intelligence, robotics and implants.
Take a look at this fascinating video of Professor Warwick of the University of Reading's Department of Cybernetics.


Thanks for reading and leave your comments :)

February 10, 2010

WiLink 7.0 Texas Instruments: WiFi, GPS, BlueTooth and FM all together

Texas Instruments has presented a new chip that aims to solve most wireless-connectivity problems for mobile devices: WiLink 7.0.
This is one of the first quad-radio chips for mobile devices, integrating WiFi (IEEE 802.11n), Bluetooth, GPS and FM radio on a single die, reducing costs by 30% and the space occupied by 50%.

Devices will gain wireless capabilities without losing performance. Analysts foresee that more than 4.5 billion radio-combo chips will be sold by 2013.

Thanks for reading :)

January 25, 2010

Electric cars are becoming more than a concept

Recently we have witnessed the arrival of GM's Volt, the electric Fiat 500, and now the Venturi Voltage concept car presented at NAIAS 2010 (image below).



All these cars have in common the use of more or less efficient batteries to power the electric motor.
I think the real revolution will come when we stop thinking only about storing energy and start thinking about how to produce it as well. A car can extend its range well beyond today's average only if it is equipped with on-board energy-production systems. Examples could be producing energy from the sun, exploiting the airflow generated by the car's own motion, and converting into electricity the heat produced by the electric motors.

I believe that technological innovation in this sector is just beginning, and it may lead to something completely new, rethinking the very concept of mobility.

Thanks for reading and leave your comments.

January 10, 2010

Augmented reality and an aerial drone. There's an app for that too.



Parrot has been demonstrating at the 2010 CES show in Las Vegas the AR.Drone, a Wi-Fi helicopter with dual cameras and augmented-reality video streaming, that you control using your iPhone or iPod Touch. The AR.Drone features four rotors and interchangeable hulls for flying both indoors and outside. Built-in flight stabilization technology keeps the drone steady while you use your iPhone’s motion sensors to steer it remotely over the craft’s Wi-Fi network.
Parrot spent four years developing the AR.Drone and creating an augmented-reality gaming platform. Using its streaming video camera, the AR.Drone's image processing can detect other drones or 3D targets. Several demo games are on display, but Parrot hopes that game developers will take advantage of its open API to develop more games and other applications for the AR.Drone.
To make the AR.Drone easy to fly, Parrot developed a microelectromechanical (MEMS) inertial guidance system that includes a three-axis accelerometer, a two-axis gyroscope, and a single-axis precision gyroscope for yaw. The flyer also includes an ultrasonic altimeter and a down-facing video camera for calculating speed and position. By tilting your iPhone or iPod Touch you can drive the drone. The built-in Wi-Fi network is used to establish the connection between the drone and the iPhone, then the video camera on the drone streams its feed directly to the screen on your iPhone. If you remove your finger from the iPhone, the AR.Drone’s autopilot keeps the drone hovering about a meter off the ground. If the network connection is lost, the autopilot will stabilize the drone and slowly lower it to the ground for a soft landing.
Parrot hopes to make the AR.Drone available in the second half of 2010. For more information visit ardrone.parrot.com. For details about the open API, visit projects.ardrone.org.




Thanks for reading :)