Projects


 
CROSMOS - Cooperation, Resource-Optimization and Self-Organization in Mobile, Mixed-Reality Environments
Available: 01.2014 - 06.2015
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description 

The CROSMOS project explores cooperative, multi-user apps and investigates methods for self-organization in technical and social environments. We enhance the apps with elements that measure the social interaction patterns among the users. A key goal in CROSMOS is to model, predict and influence social interaction, which is important on the one hand to improve our understanding of social behavior in mixed-reality environments and on the other hand to improve the design and resource utilization of mobile apps. User interactions have been studied intensively in social networks; the holistic investigation of self-organization in technical and social environments, however, is a new field of research to which we expect to contribute with our innovative Social Analytics approach, which investigates not only app usage but also the social impact and the communication beyond the device. As a use case we develop a multi-user, mobile game for our project partner, the Klagenfurter Stadtwerke.
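To make the Social Analytics idea more concrete, the Python sketch below logs in-app interactions between users and counts how often each pair interacts. All names (SocialAnalytics, InteractionEvent) and the data model are hypothetical illustrations, not part of the actual CROSMOS apps.

# Minimal sketch of logging social interactions in a multi-user app
# (hypothetical data model, not the CROSMOS implementation).
from collections import Counter
from dataclasses import dataclass
import time

@dataclass
class InteractionEvent:
    source_user: str      # user who initiated the interaction
    target_user: str      # user who received it
    kind: str             # e.g. "message", "trade", "team_up"
    timestamp: float

class SocialAnalytics:
    def __init__(self):
        self.events = []

    def log(self, source, target, kind):
        self.events.append(InteractionEvent(source, target, kind, time.time()))

    def interaction_counts(self):
        """Count interactions per (source, target) pair."""
        return Counter((e.source_user, e.target_user) for e in self.events)

# Usage: log a few events and inspect the interaction pattern.
analytics = SocialAnalytics()
analytics.log("alice", "bob", "team_up")
analytics.log("alice", "bob", "message")
analytics.log("bob", "carol", "trade")
print(analytics.interaction_counts())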

 

CROSMOS Website



SINUS - Self-organizing Intelligent Networks of UAVs
Available: 01.2013 - 06.2015
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description

 

Autonomous unmanned aerial vehicles (UAVs) are used with increasing interest in civil and commercial applications and by the scientific research community. Small-scale multicopters are of particular interest due to their ease of deployment, high maneuverability, and low cost. Recently, research and development effort has shifted towards using teams of UAVs for monitoring, surveillance, or disaster assistance, to name a few applications. However, the design principles of such multi-UAV systems still require intensive investigation and remain an open research problem.

We can abstract a multi-UAV system by four key components: (i) the multiple UAV platforms, (ii) the sensing component which analyzes the captured data, (iii) the aerial network which provides wireless networking functionality among the UAVs and the ground station, and (iv) the coordination component which organizes the individual tasks of the UAVs to achieve a common mission goal. The SINUS project focuses on the integration of these components and their interaction to effectively close the sensing-networking-acting loop within the multi-UAV system. Such a tight integration is necessary for deploying self-organizing UAVs in dynamic and partly unknown environments.
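The Python sketch below illustrates this four-component abstraction and one iteration of the sensing-networking-acting loop. All class and method names (Sensing, AerialNetwork, Coordination, UAV) are illustrative placeholders, not SINUS code.

# Hypothetical sketch of the four-component multi-UAV abstraction
# and one iteration of the sensing-networking-acting loop.

class Sensing:
    def analyze(self, raw_image):
        # Placeholder: extract observations (e.g. detected objects) from captured data.
        return {"objects": []}

class AerialNetwork:
    def share(self, uav_id, observation):
        # Placeholder: exchange observations among UAVs and the ground station.
        print(f"UAV {uav_id} shares {observation}")

class Coordination:
    def next_task(self, uav_id, shared_state):
        # Placeholder: assign the next waypoint so the team meets the mission goal.
        return ("goto", (uav_id * 10.0, 0.0))

class UAV:
    def __init__(self, uav_id, sensing, network, coordination):
        self.uav_id = uav_id
        self.sensing = sensing
        self.network = network
        self.coordination = coordination

    def step(self, raw_image, shared_state):
        # Sensing -> networking -> acting: one iteration of the closed loop.
        observation = self.sensing.analyze(raw_image)
        self.network.share(self.uav_id, observation)
        task = self.coordination.next_task(self.uav_id, shared_state)
        print(f"UAV {self.uav_id} executes {task}")

# One loop iteration for a team of three UAVs.
sensing, network, coordination = Sensing(), AerialNetwork(), Coordination()
for i in range(3):
    UAV(i, sensing, network, coordination).step(raw_image=None, shared_state={})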

SINUS Website



TrustEYE - Trustworthy Sensing and Cooperation in Visual Sensor Networks
Available: 08.2012 - 07.2015
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner & Dipl.-Ing. Dr. techn. Thomas Winkler
E-Mail
Phone: +43 (0)463 2700 3672
Fax: +43 (0)463 2700 3679
Description


The emerging research area of visual sensor networks (VSNs) combines concepts from sensor networks, embedded computing and computer vision. In many VSN applications, sensitive personal data is captured and analyzed. A few partial approaches towards security and privacy protection in VSNs exist, but systematically establishing a secure and privacy-preserving VSN is still an open research question. The fundamental hypothesis of this research is that trust in resource-limited VSNs can be established by making security and privacy protection inherent properties of the image sensing unit. The key idea is to “protect” access to the sensor and encapsulate dedicated security and privacy functionality in a TrustEYE, a secure sensing unit embedded on the smart camera. The TrustEYE has exclusive access to the image sensor's raw data. It separates sensitive from non-sensitive data by applying dedicated image analysis and ensures that only non-sensitive data is made available to the camera host system. Furthermore, the TrustEYE provides integrity, authenticity, freshness and timestamping guarantees based on cryptographic techniques. This approach clearly separates privacy protection and security functionality from application code. Application developers only get access to pre-processed and protected data. Thus, security and privacy protection are no longer the sole responsibility of application developers.
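As a rough illustration of the concept, the Python sketch below redacts hypothetical sensitive regions on the sensing unit and releases only authenticated, timestamped data to the camera host. The region detection, key handling and data format are simplifying assumptions, not the actual TrustEYE design; the cryptographic primitives are taken from Python's standard hmac and hashlib modules.

# Sketch of a TrustEYE-style secure sensing unit: only the sensing unit sees raw
# data, sensitive regions are removed, and the released data is authenticated
# and timestamped. Names and the redaction step are illustrative assumptions.
import hashlib
import hmac
import json
import time

SECRET_KEY = b"device-specific-secret"  # in practice stored in secure hardware

def detect_sensitive_regions(raw_frame):
    # Placeholder for on-sensor image analysis (e.g. face detection).
    return [(10, 10, 32, 32)]  # list of (x, y, w, h) regions

def remove_regions(raw_frame, regions):
    # Placeholder: blank or blur the sensitive regions before release.
    return {"frame": "non-sensitive pixels only", "redacted": regions}

def release_frame(raw_frame, frame_counter):
    """Return only protected data: redacted frame + timestamp + MAC."""
    protected = remove_regions(raw_frame, detect_sensitive_regions(raw_frame))
    payload = {
        "data": protected,
        "timestamp": time.time(),   # timestamping
        "counter": frame_counter,   # freshness (monotonic counter)
    }
    serialized = json.dumps(payload, sort_keys=True).encode()
    mac = hmac.new(SECRET_KEY, serialized, hashlib.sha256).hexdigest()
    return {"payload": payload, "mac": mac}  # integrity + authenticity

# The camera host only ever sees the output of release_frame().
print(release_frame(raw_frame=b"raw sensor bytes", frame_counter=1))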


 TrustEYE Website



ICE Booster - Erasmus Mundus Joint Doctorate in Interactive and Cognitive Environments
Available: 01.2011 - 12.2014
Contact: Univ.-Prof. Dipl.-Ing. Dr. Bernhard Rinner / Univ.-Prof. Dr.-Ing. Christian Bettstetter
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3670
Description



ICE - Erasmus Mundus Joint Doctorate in Interactive and Cognitive Environments
Available: 01.2011 - 12.2015
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


The Joint Doctoral Programme in Interactive and Cognitive Environments offers PhD candidates an education programme in research fields related to computer science, electronic and telecommunication engineering, and industrial design. Candidates must hold a Master of Science (M.Sc.) degree or an equivalent title and have in-depth knowledge and understanding of the principles of ICT engineering. ICE aims at developing and enhancing the candidates' knowledge and skills in order to shape a new generation of professionals able to exploit (and further enhance) cutting-edge ICT technologies to design and implement, in multi-disciplinary teams, innovative solutions in ever more pervasive fields of application. The programme has three major focuses that are pursued iteratively throughout the candidate's track. The first concerns the acquisition and formalization of knowledge in specific advanced domains, achieved through lectures and seminars. The second aims at keeping PhD students in close contact with leading research groups at the five partner universities that have proven experience in the basic disciplines necessary for the PhD course. The third involves joint industry/academia research activities on projects in cooperation with leading ICT companies, typically under international institutional umbrellas such as the European Research Frameworks.

 

ICE Website



EPiCS - Engineering Proprioception in Computing Systems
Available: 09.2010 - 09.2014
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


Eight partners - Univ. Paderborn (coordinator), Imperial College London, Univ. Oslo, Univ. Birmingham, EADS Munich, ETH Zurich, AIT Vienna and Klagenfurt University - will perform research on self-awareness in computing systems. The EPiCS project aims at laying the foundation for engineering the novel class of proprioceptive computing systems. Proprioceptive computing systems collect and maintain information about their state and progress, which enables self-awareness by reasoning about their behaviour, and self-expression by effectively and autonomously adapting their behaviour to changing conditions. The Pervasive Computing group will contribute research on the self-organization of visual sensor networks.
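A minimal Python sketch of the idea follows, assuming a hypothetical smart camera node that observes its own state, reasons about it (self-awareness) and adapts its frame rate (self-expression). It illustrates the concept only and is not EPiCS code.

# Illustrative sketch of a proprioceptive node: it maintains information about
# its own state, reasons about it (self-awareness), and adapts its behaviour
# accordingly (self-expression). All names and thresholds are hypothetical.

class ProprioceptiveCamera:
    def __init__(self):
        self.state = {"battery": 1.0, "detection_rate": 0.0, "frame_rate": 30}

    def observe_self(self, battery, detection_rate):
        # Collect and maintain information about own state and progress.
        self.state["battery"] = battery
        self.state["detection_rate"] = detection_rate

    def reason(self):
        # Self-awareness: derive a judgement about the current situation.
        if self.state["battery"] < 0.2:
            return "conserve_energy"
        if self.state["detection_rate"] > 0.8:
            return "increase_attention"
        return "steady"

    def adapt(self, decision):
        # Self-expression: autonomously change behaviour.
        if decision == "conserve_energy":
            self.state["frame_rate"] = 5
        elif decision == "increase_attention":
            self.state["frame_rate"] = 60

cam = ProprioceptiveCamera()
cam.observe_self(battery=0.15, detection_rate=0.3)
cam.adapt(cam.reason())
print(cam.state)  # frame_rate drops to 5 to conserve energy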

 

EPiCS Website



Mobi Trick - Mobile Traffic Checker
Available: 05.2010 - 08.2013
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


Mobi Trick is a joint research project with Graz University of Technology and Efkon AG Graz. The goal is to develop a portable device for traffic enforcement (e.g., tolling). Mobile systems are limited in size and resources, so there is not much space for a large number of sensors. The work in this project will therefore focus on stereo vision, but with two different types of cameras (e.g., color and infrared). The system needs to adapt itself to changing situations when it is used at a new location, which requires adaptive calibration and online learning. Energy is always a scarce resource in embedded devices, so the system must be designed to be very energy efficient. New approaches for context-aware dynamic power management will be explored.
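As an illustration of context-aware dynamic power management, the Python sketch below selects a power mode from two simple context cues. The mode names and thresholds are hypothetical and do not describe the Mobi Trick device.

# Hypothetical sketch of context-aware dynamic power management for a portable
# traffic-enforcement device: the power mode follows observed traffic density
# and remaining battery. Thresholds and mode names are illustrative only.

def select_power_mode(vehicles_per_minute, battery_level):
    """Pick a power mode from simple context cues."""
    if battery_level < 0.15:
        return "sleep"            # almost empty: wake up only periodically
    if vehicles_per_minute == 0:
        return "standby"          # no traffic: keep sensors in a low-power state
    if vehicles_per_minute < 10:
        return "single_camera"    # light traffic: one camera is enough
    return "stereo_full_rate"     # dense traffic: both cameras at full frame rate

# Example: moderate traffic with a healthy battery.
print(select_power_mode(vehicles_per_minute=25, battery_level=0.8))  # stereo_full_rate
print(select_power_mode(vehicles_per_minute=3, battery_level=0.5))   # single_camera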

 

Mobi Trick Website



SRSnet - Smart Resource-Aware Multi-Sensor Network
Available: 09.2009 - 09.2012
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


The SRSnet project focuses on the design of a smart resource-aware multi-sensor network capable of autonomously detecting and localizing various events such as screams, animal noise, tracks of persons and more complex human behaviors. The project's research areas include (i) collaborative audio and video analysis, (ii) complex event detection and (iii) network reconfiguration. The SRSnet will be demonstrated in an environmental case study at the Hohe Tauern National Park.

This project is joint work with the Institute of Smart Systems Technologies at Klagenfurt University, the University of Udine and Lakeside Labs. It is funded by the European Interreg 4 Fund and the Carinthian Economic Promotion Fund.
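The Python sketch below illustrates the complex-event-detection idea with hypothetical detections: an audio "scream" event is combined with a nearby person track within a short time window. Thresholds, names and the coordinate frame are illustrative assumptions, not the SRSnet implementation.

# Illustrative sketch of complex event detection from collaborative audio and
# video analysis: a scream plus a nearby person track raises a higher-level event.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str            # e.g. "scream", "person_track"
    position: tuple      # (x, y) in a common coordinate frame
    timestamp: float     # seconds

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def detect_complex_events(detections, max_dist=10.0, max_dt=5.0):
    """Pair screams with person tracks that are close in space and time."""
    screams = [d for d in detections if d.kind == "scream"]
    tracks = [d for d in detections if d.kind == "person_track"]
    events = []
    for s in screams:
        for t in tracks:
            if distance(s.position, t.position) <= max_dist and abs(s.timestamp - t.timestamp) <= max_dt:
                events.append(("person_in_distress", s.position, s.timestamp))
    return events

observations = [
    Detection("scream", (5.0, 5.0), 100.0),
    Detection("person_track", (7.0, 6.0), 101.5),
]
print(detect_complex_events(observations))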

 

SRSnet Website



CLIC - Closed Loop Integration of Cognition, Communication and Control
Available: 01.2009 - 12.2010
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description

The objective of the CLIC (Closed-Loop Integration of Cognition, Communication and Control) project is to integrate real-time image analysis, adaptive motion control and synchronous communication. The combination of these areas enables innovative control and safety mechanisms as well as improved energy efficiency. The sample application will be a crane that can autonomously avoid collisions with stationary and moving objects. Combining these components in real time enables the crane to load a vehicle while it is in motion.
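The Python sketch below shows one cycle of such a closed loop, assuming hypothetical placeholder functions for image analysis and motion planning. It illustrates the sense-plan-act structure rather than the actual CLIC controller.

# Hypothetical sketch of the closed loop in CLIC-style crane control:
# image analysis produces obstacle positions, the controller checks for
# imminent collisions, and the motion command is adapted in each cycle.

def analyze_image(frame):
    # Placeholder for real-time image analysis: return obstacle positions.
    return [(2.0, 1.0)]

def plan_motion(crane_position, target, obstacles, safety_radius=1.5):
    """Stop if any obstacle is inside the safety radius, otherwise move on."""
    for ox, oy in obstacles:
        if ((crane_position[0] - ox) ** 2 + (crane_position[1] - oy) ** 2) ** 0.5 < safety_radius:
            return "stop"
    return f"move_towards {target}"

def control_cycle(frame, crane_position, target):
    obstacles = analyze_image(frame)                          # cognition
    command = plan_motion(crane_position, target, obstacles)  # control
    return command                                            # sent over synchronous communication

# One cycle: the crane is close to an obstacle and stops.
print(control_cycle(frame=None, crane_position=(2.5, 1.5), target=(10.0, 0.0)))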
 



SOMA - Self Organizing Multimedia Architecture
Available: 12.2008 - 01.2012
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


The project Self-organizing Multimedia Architecture (SOMA) aims to capture the whole life-cycle of multimedia content in a single architecture for large distributed multimedia information systems. In SOMA we focus on scenarios where events, which we understand as limited time periods of special importance, are a central concept. Within the project we investigate the behavior of small but efficient (computing) units working self-contained but collaboratively. A network of smart sensors reports events, captured in multimedia data units, to a distribution network. In the distribution network events are analyzed, processed, stored, and prepared for delivery. Events and related continuous data are either pushed to users on a subscription basis or consumed by users via pull mechanisms. Based on the consumption behavior and user feedback, the popularity and relevance of delivered content is assessed and reported back to the distribution and the sensor networks.

SOMA is a joint research project of the Institute of Information Technology, the Institute of Networked and Embedded Systems and ASFiNAG Mautservice.
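The Python sketch below illustrates the push/pull delivery and popularity feedback described above, with hypothetical names (DistributionNetwork, report, pull). It is a conceptual illustration, not SOMA code.

# Illustrative sketch of SOMA-style event delivery: events reported by sensors
# are pushed to subscribers, can be pulled on demand, and consumption is fed
# back as a simple popularity score.
from collections import defaultdict

class DistributionNetwork:
    def __init__(self):
        self.subscribers = defaultdict(list)   # event type -> list of user ids
        self.store = []                        # analyzed and stored events
        self.popularity = defaultdict(int)     # event type -> consumption count

    def subscribe(self, user, event_type):
        self.subscribers[event_type].append(user)

    def report(self, event_type, payload):
        # The sensor network reports an event; push it to all subscribers.
        self.store.append((event_type, payload))
        for user in self.subscribers[event_type]:
            self.popularity[event_type] += 1
            print(f"push {event_type} to {user}")

    def pull(self, event_type):
        # Users may also fetch events on demand.
        self.popularity[event_type] += 1
        return [p for t, p in self.store if t == event_type]

net = DistributionNetwork()
net.subscribe("alice", "traffic_jam")
net.report("traffic_jam", {"location": "A2 km 312"})
print(net.pull("traffic_jam"), dict(net.popularity))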

 

SOMA Website



McDAV - Multi-camera Data Aggregation and Visualization
Available: 05.2008 - 09.2009
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


Smart cameras perform image analysis onboard and deliver the abstracted data.  By combining data delivered from multiple cameras observing the same scene we can further increase the usefulness of smart camera networks.  An important goal for such multi-camera systems is to resolve object occlusions by aggregating views from different angles. The aim of this research is to develop a data aggregation and visualization system which is able to combine the high-level output of smart cameras to form a three-dimensional model of a scene. First, the system collects the high-level frame description from each camera. Second, all objects in those descriptions are localized using the visual angles collected in the single views. Finally, the complete scene is visualized in a 3D-model. This research is conducted in cooperation with Austrian Research Centers Seibersdorf.
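As a simplified, planar illustration of localizing an object from the visual angles reported by two cameras, the Python sketch below intersects two bearing rays. The actual McDAV system builds a full 3D scene model, so this is only a conceptual sketch with hypothetical names.

# Sketch of localizing an object from the visual angles reported by two smart
# cameras (2D triangulation of two bearing rays).
import math

def localize(cam1_pos, bearing1, cam2_pos, bearing2):
    """Intersect two bearing rays (angles in radians, world frame)."""
    x1, y1 = cam1_pos
    x2, y2 = cam2_pos
    c1, s1 = math.cos(bearing1), math.sin(bearing1)
    c2, s2 = math.cos(bearing2), math.sin(bearing2)
    denom = c1 * s2 - s1 * c2
    if abs(denom) < 1e-9:
        return None  # rays are parallel: the object cannot be localized
    t1 = ((x2 - x1) * s2 - (y2 - y1) * c2) / denom
    return (x1 + t1 * c1, y1 + t1 * s1)

# Two cameras 10 m apart both see the same person; the estimate is (5, 5).
print(localize((0.0, 0.0), math.radians(45), (10.0, 0.0), math.radians(135)))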

 



cDrones - Collaborative Microdrones
Available: 04.2008 - 12.2012
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


Microdrones are small-scale unmanned aerial vehicles carrying payloads such as cameras and sensors. This project develops a system for aerial sensing based on cooperating, wirelessly networked microdrones. Several microdrones will fly in formation over the area of interest in a self-organizing manner and deliver high-quality sensor data such as images or videos. These images are fused on the ground, analyzed in real time, and delivered to the user. The project will perform original research in the areas of (1) flight formation, (2) mission planning and control, and (3) sensor data interpretation, and it will demonstrate a collaborative microdrone system for fire response operations.
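The Python sketch below illustrates a very simple mission-planning step, assuming a rectangular area covered by a lawnmower waypoint grid that is split into contiguous segments, one per drone. The real cDrones planner (formation flight, energy constraints, etc.) is considerably more sophisticated; all names and parameters here are hypothetical.

# Hypothetical sketch of a mission-planning step: generate a coverage grid over
# the area of interest and assign contiguous waypoint segments to the drones.

def coverage_waypoints(width, height, spacing):
    """Generate a lawnmower-style waypoint grid over a rectangular area."""
    waypoints = []
    y = 0.0
    direction = 1
    while y <= height:
        xs = list(range(0, int(width) + 1, int(spacing)))
        row = [(float(x), y) for x in (xs if direction == 1 else xs[::-1])]
        waypoints.extend(row)
        y += spacing
        direction *= -1
    return waypoints

def assign_to_drones(waypoints, num_drones):
    """Split the waypoint list into contiguous segments, one per drone."""
    chunk = -(-len(waypoints) // num_drones)  # ceiling division
    return {d: waypoints[d * chunk:(d + 1) * chunk] for d in range(num_drones)}

plan = assign_to_drones(coverage_waypoints(width=40, height=20, spacing=10), num_drones=3)
for drone, wps in plan.items():
    print(f"drone {drone}: {wps}")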

 

cDrones Website



Pervasive Smart Cameras
Available: ongoing
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


Distributed smart cameras are real-time distributed embedded systems that perform computer vision using multiple cameras.  This new approach is emerging thanks to a confluence of demanding applications and the huge computational and communications abilities enabled by Moore’s Law.  This interdisciplinary field builds upon techniques from computer vision, distributed computing, and embedded computing and sensor networks. In this strategic research area we focus on several aspects of distributed smart cameras such as distributed resource management, collaborative image processing, sensor fusion and multimedia.  We develop prototypes of distributed smart camera systems and apply them in case studies such as traffic monitoring and security.


 



EVis - Autonomous Traffic Monitoring by Embedded Vision
Available: 04.2007 - 06.2010
Contact: Univ.-Prof. Dipl.-Ing. Dr. techn. Bernhard Rinner
E-Mail
Phone: +43 (0)463 2700 3670
Fax: +43 (0)463 2700 3679
Description


The world will witness a tremendous increase in the number of vehicles in the near future. Future traffic monitoring systems will therefore play an important role in improving the throughput and safety of roads. Current monitoring systems capture (usually vision-based) traffic data from a large sensor network; however, they require continuous human supervision, which is extremely expensive.

In the EVis research project we investigate the scientific and technological foundations for future autonomous traffic monitoring systems. Autonomy is achieved by a novel combination of three approaches: First, vision-based detection and classification methods are augmented by self-learning and scene adaptation mechanisms which significantly reduce the effort of manual configuration. Second, visual data is fused with data from other sensors such as radar, infrared or inductive loop sensors. Sensor fusion helps to improve robustness and confidence, to extend the spatial and temporal coverage, and to reduce the ambiguity and uncertainty of the processed sensor data. Finally, the developed vision and fusion methods are implemented on a distributed embedded platform, which makes them more widely applicable and supports real-time operation.
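As a toy illustration of confidence-level sensor fusion, the Python sketch below accepts a vehicle detection when the weighted evidence from camera and radar exceeds a threshold. The weights, threshold and function names are hypothetical assumptions, not the EVis algorithms.

# Illustrative sketch of fusing per-sensor detection confidences: combining
# camera and radar evidence is more robust than either sensor alone
# (e.g. camera at night, radar in clutter).

def fuse_detections(vision_conf, radar_conf, w_vision=0.6, w_radar=0.4, threshold=0.5):
    """Weighted fusion of per-sensor detection confidences in [0, 1]."""
    combined = w_vision * vision_conf + w_radar * radar_conf
    return combined >= threshold, combined

# Poor visibility: the camera is unsure, but radar evidence tips the decision.
accepted, score = fuse_detections(vision_conf=0.35, radar_conf=0.9)
print(accepted, round(score, 2))  # True 0.57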

The technological output of the EVis project allows customers to take advantage of traffic management solutions that are easier to install, easier to operate and maintain, and more robust. Furthermore, this technology enables multi-task operation in a single system capable of traffic monitoring, vehicle identification and classification, incident detection, traffic rule enforcement, and observation of (critical) driver behavior.

The industrial partner EFKON is a globally operating company and expects the EU, the Middle East, North America, and Asia to be the primary markets for this technology. The mid-term market potential is estimated at significantly more than 100 million euros.

 

EVis Website



 
 