
1

V11: User Interaction in AmI

Dr.-Ing. Reiner Wichert, Fraunhofer-Institut für Graphische Datenverarbeitung IGD

Holger Graf

Fraunhofer-Institut für Graphische Datenverarbeitung IGD

Mohammad-Reza (Saied) Tazari

Fraunhofer-Institut für Graphische Datenverarbeitung IGD

Ambient Intelligence, WS 10/11

2

Outline

Delimitation of the topic

An analysis

UI description languages

Example models & approaches

The PERSONA UI framework

3

RECAP FROM LECTURE #1 + DELIMITATION OF THE TOPIC

User Interaction in Smart Environments

4

Methodological Approach

Change of the interaction

1. User goal

2. Strategy

3. Execution

5

(2) Software architecture for the coordination of devices and applications.

(1) Natural-language interaction becomes possible. Multimodal interaction becomes possible.

(3) Interaction models, mental models, inference mechanisms, inductive reasoning.

Ambient Intelligence bundles all areas of computer science

(4) Devices / sensors

6

“Heller!” (“Brighter!”)

Vision

Source: EMBASSI

7

Implicit versus Explicit Interaction (I)

• Implicit: observing the user's actions, even when they are not meant as direct interaction with the environment

• Capturing the actions and sharing them within the system as contextual events

• Analyzing the resulting situations and deriving possible user goals

• Deriving possible reactions by the environment

Partial aspect of Lectures #4 & #7

Not the subject of this lecture

8

Implicit versus Explicit Interaction (II)

• Explicit: the user consciously interacts with the environment

• Either as a direct instruction, e.g.

• User: "Close the window in the bedroom!"

• Or reacting to queries from the environment, e.g.

• System: "too much smoke in the kitchen; should the window be opened?"

• User: "Yes!"

Subject of this lecture

9

Explicit UI over I/O channels has long stood in the shadow of "implicit interaction" over sensing channels in AmI

Advances that help explicit UI become more important:

proliferation of (multi-)touch sensing, HD displays, & displays embedded in all possible devices

new interaction forms supported by special devices (e.g. WiiMote)

qualitative advances in

▫ speech recognition

▫ natural language processing

▫ gesture recognition (e.g., Kinect)

socio-political pressure on “accessibility for all”

The Importance of Explicit User Interaction

10

AN ANALYSIS

User Interaction in Smart Environments

11

The Pipe-Lines Model by Nigay & Coutaz (1997)

Source: citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.144.7554

12

Intelligent Environments with Networked Devices

13

Multiplicity in Intelligent Environments

Source: dissertation by Marco Blumendorf, http://opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf

14

living room TV

sleeping room TV

a display in the entrance

a display integrated in the fridge door

mirrors capable of becoming displays

microphone arrays installed in all rooms

loudspeakers installed in all rooms

phones providing displays, microphones, (loud)speakers

hi-fi providing loudspeakers

…

An infrastructure of available I/O channels

I/O Devices in emerging Smart Homes

15

Smart Environments as Open Distributed Systems

16

The Consequence

I/O Infrastructure

Open Distributed System

17

Terminology

1. According to Blumendorf

▫ Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf

2. According to Tazari

▫ Source: www.gris.tu-darmstadt.de/teaching/courses/ws1011/ambient/slides/PERSONA_Architektur_Manual.pdf

18

Terminology according to Blumendorf

1. Interaction Resource (IR)

An atomic (one-way, single-modality) I/O channel exploitable by a user for executing a task, e.g., keyboards, mice, screens, speakers, microphones, or cameras.

2. Interaction Device (ID)

A computing system that handles the input of, or sends output to, the individual IRs connected to it. Hence, an ID is a collection of IRs together with the computing unit. It comprises the hardware used for the interaction (e.g. screen, keyboard, touch-pad) as well as a software platform for communication and presentation tasks.
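Blumendorf's two terms can be illustrated with a small code sketch (hypothetical classes written for this lecture, not Blumendorf's or PERSONA's actual API): an Interaction Device aggregates the atomic, one-way Interaction Resources connected to it.

```java
import java.util.ArrayList;
import java.util.List;

// Direction of an atomic I/O channel; IRs are one-way by definition.
enum Direction { INPUT, OUTPUT }

// Interaction Resource (IR): an atomic, single-modality, one-way channel,
// e.g. a keyboard (input) or a screen (output).
class InteractionResource {
    final String name;
    final Direction direction;
    InteractionResource(String name, Direction direction) {
        this.name = name;
        this.direction = direction;
    }
}

// Interaction Device (ID): a computing unit plus the IRs connected to it.
class InteractionDevice {
    final String name;
    private final List<InteractionResource> connected = new ArrayList<>();
    InteractionDevice(String name) { this.name = name; }
    void connect(InteractionResource ir) { connected.add(ir); }
    // All IRs of this device usable in the given direction.
    List<InteractionResource> resources(Direction d) {
        List<InteractionResource> result = new ArrayList<>();
        for (InteractionResource ir : connected)
            if (ir.direction == d) result.add(ir);
        return result;
    }
}
```

A phone, for instance, would be one ID connecting a display IR (output) and a microphone IR (input).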

19

Terminology according to Tazari

Channel: Smart environments need to bridge between the physical world and the virtual realm with the help of certain devices. Channel denotes the bridging passage provided by such devices between the physical world and the virtual realm. Depending on the kind of channel opened, a channel might be called a sensing channel (provided by sensors), an acting channel (provided by actuators), an input channel (provided by microphones, keyboards, etc.), or an output channel (provided by displays, loudspeakers, etc.). The latter two types of channels might be referred to as I/O channels.

I/O Device: An abbreviation for input and / or output device. A device that provides an input and / or output channel for facilitating explicit interaction between a smart environment and its human users. Input devices, such as a microphone, a keyboard, or a mouse, can capture an instruction or response that is provided by a human user and represent it in terms of data in the virtual realm. Upon receipt of data within the virtual realm that is intended to be presented to human users, output devices, such as displays and loudspeakers, can make it perceivable to the addressed humans.
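Tazari's channel taxonomy boils down to a simple classification, sketched here as an illustrative enum (invented for this lecture, not part of any framework):

```java
// Channel kinds per Tazari's terminology.
enum ChannelKind {
    SENSING,  // provided by sensors
    ACTING,   // provided by actuators
    INPUT,    // provided by microphones, keyboards, etc.
    OUTPUT;   // provided by displays, loudspeakers, etc.

    // Input and output channels together are referred to as I/O channels.
    boolean isIOChannel() { return this == INPUT || this == OUTPUT; }
}
```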

20

Recall Concept Maps from V3

21

Requirements

1. According to Blumendorf

▫ Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf

2. According to Tazari

▫ Source: www.springerlink.com/content/5l3685543k4v8524/

22

Requirements according to Blumendorf

• Shapeability to address different layouts for users, device capabilities and usage contexts,

• distribution across multiple interaction devices,

• multimodality to support various input and output modalities,

• shareability between multiple users,

• mergability and interoperability of different applications.

Essentially, these are all different aspects of adaptability

23

Adaptability Requirements according to Blumendorf: Shapeability

Layout change depending on user distance to the screen

24

Adaptability Requirements according to Blumendorf: Distribution

The user interface can be distributed across multiple interaction devices and kept continuously synchronized

25

Adaptability Requirements according to Blumendorf: Multimodality

user is able to utilize multiple interaction resources and modalities including voice, touch and gesture simultaneously

26

Adaptability Requirements according to Blumendorf: Shareability

Two users sharing applications

27

Adaptability Requirements according to Blumendorf: Mergability

UI of a cooking assistant embedded in a meta-UI controlling interaction parameters

28

Requirements according to Tazari

Separating I/O channel management from applications

Modality- / layout-neutral dialogs

Brokerage mechanisms

Support for adaptive dialogs

Task division between layers

Availability of user context, capabilities, and preferences to all layers

Handling input & output

Modality fusion & fission

Context-free input

29

UI DESCRIPTION LANGUAGES

User Interaction in Smart Environments

30

Need for Declarative Languages

• A direct consequence of separating the application layer from the presentation layer

(e.g., server: www.google.com; client: Firefox; language = HTML; protocol = HTTP)

31

The problem with HTML

• Not really modality-neutral

• Sometimes imposes a certain layout

More abstract and neutral languages have been investigated for more than 10 years:

UIML

TERESA XML

UsiXML

SMIL

EMMA

XISL

XForms

32

XForms - Separation of Values from Controls

There are two parts to the essence of XForms. The first is to separate

what is being returned from how the values are filled in:

• The model specifies the values being collected (the instance), and their related logic

• Types, restrictions

• Initial values, relations between values

• The body of the document then binds form controls to values in the instance

Source: www.w3.org/2006/Talks/05-26-steven-XForms/
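A minimal XForms fragment illustrates this separation (a sketch following XForms 1.1 syntax; the `person` instance data is invented for this example):

```xml
<!-- The model: the values being collected, plus their logic -->
<xf:model xmlns:xf="http://www.w3.org/2002/xforms"
          xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xf:instance>
    <person xmlns="">
      <name/>
      <age/>
    </person>
  </xf:instance>
  <!-- a type restriction on one of the values -->
  <xf:bind nodeset="age" type="xsd:integer"/>
</xf:model>

<!-- The body: controls bound to values in the instance -->
<xf:input ref="name"><xf:label>Name</xf:label></xf:input>
<xf:input ref="age"><xf:label>Age</xf:label></xf:input>
```

Note that the body says nothing about rendering: `xf:input` only declares the intent to collect a value, which is what makes the language comparatively modality-neutral.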

33

XForms – Intent-based Controls

Source: www.w3.org/2006/Talks/05-26-steven-XForms/

34

EXAMPLE MODELS & APPROACHES

User Interaction in Smart Environments

35

The W3C Multimodal Interaction Framework – Overview

Source (also for the next 2 slides): www.w3.org/TR/mmi-framework/

36

The W3C Multimodal Interaction Framework – Input Side

37

The W3C Multimodal Interaction Framework – Output Side

38

Approach by Sottet et al.

Source: http://www.springerlink.com/content/t441q8wk3n48307p/

39

A Runtime Architecture according to Clerckx et al.

Source: books.google.de/books?id=WktQJSBKY50C&pg=PA339&lpg=PA339

40

The MASP Architecture according to Blumendorf

Source: opus.kobv.de/tuberlin/volltexte/2009/2325/pdf/blumendorf_marco.pdf

41

THE PERSONA UI FRAMEWORK

User Interaction in Smart Environments

42

Dialog Descriptions

Goal: Modality- & Layout-neutral

PERSONA solution inspired by XForms

Apparently the most advanced form-based solution

Separating the form UI description from the form data

Define a “dialog package” based on XForms UI controls

Use PERSONA's own RDF-based data model instead of adding new complexity
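The idea of a modality- and layout-neutral dialog package can be sketched as follows (hypothetical classes invented for illustration; PERSONA's real dialog package is built on XForms UI controls over an RDF data model and looks different):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A form control states only *what* is asked, never *how* it is rendered:
// a speech handler may read the label aloud, a GUI handler may draw a widget.
class Control {
    final String label; // human-readable prompt
    final String ref;   // reference into the dialog's data model (RDF in PERSONA)
    Control(String label, String ref) { this.label = label; this.ref = ref; }
}

// The "dialog package": a set of controls plus the data they are bound to,
// keeping the UI description separate from the form data.
class Dialog {
    final Map<String, Object> data = new LinkedHashMap<>();      // stands in for the RDF model
    final Map<String, Control> controls = new LinkedHashMap<>(); // the XForms-like UI part
    void add(Control c) { controls.put(c.ref, c); }
    void submit(String ref, Object value) { data.put(ref, value); }
}
```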

43

The Dialog Package

44

Cornerstone: I/O Buses

Capabilities of the I/O handlers

appropriateness for certain access impairments

supported languages and modalities

locations where output can be presented

modality-specific tuning capabilities

Dialog ID

The Brokerage / Adaptation

45

Parameters provided by the app

Content language & privacy level

Addressed user

Parameters added by the UI Framework

the presentation location and modality

access impairments to be considered

modality-specific recommendations
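The division of labour between the application and the UI framework amounts to merging two parameter sets into one output event, roughly like this (an illustrative sketch, not PERSONA's API; the map and key names are assumptions, with property names taken from slide 57):

```java
import java.util.HashMap;
import java.util.Map;

// Builds the full adaptation parameter set of an output event: the application
// provides content-related parameters, the UI framework adds the
// presentation-related ones it derives from context.
class OutputEventParams {
    static Map<String, String> merge(Map<String, String> fromApp,
                                     Map<String, String> fromFramework) {
        Map<String, String> all = new HashMap<>(fromApp);
        all.putAll(fromFramework); // the framework decides the presentation
        return all;
    }
}
```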

Supporting the Output Bus in Adaptation

(Diagram: applications publish dialogs to the output bus, whose output events are handled by the I/O handlers; in the adaptation, the Dialog Manager draws on Profiling and on the Situation Reasoner, which fetches and updates facts / rules in the Context History Entrepôt via a SPARQL engine.)

46

Coherent representation of the whole system

Management of Dialogs

▫ Per user & priority-based management of dialog queues

▫ Suspending dialogs and continuing later

Providing the system main menu

Handling context-free input

More on the Dialog Manager

47

Input

(Diagram: input handlers, e.g. for speech recognition and gesture recognition, publish input events to the Input Bus; Application 1 and Application 2 subscribe to it.)
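The publish/subscribe pattern behind the Input Bus can be sketched in a few lines (a simplification; the real PERSONA buses run on OSGi and carry typed InputEvents rather than plain strings):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal publish/subscribe bus in the spirit of slide 47: handlers publish
// recognized input as strings, subscribed applications receive every event.
class SimpleInputBus {
    private final List<Consumer<String>> subscribers = new ArrayList<>();
    void subscribe(Consumer<String> application) { subscribers.add(application); }
    void publish(String inputEvent) {
        for (Consumer<String> application : subscribers)
            application.accept(inputEvent);
    }
}
```

The decisive property is the decoupling: a speech-recognition handler publishing to the bus knows nothing about which applications consume its events.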

48

Input Fusion

(Diagram: the user says "Switch On!" while pointing at the TV set; speech recognition and gesture recognition feed a fusion component, which publishes the combined input "Switch On TV set" to the Input Bus for Application 1.)
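The fusion step itself, reduced to its essence (an illustrative sketch, not PERSONA's fusion component): speech recognition contributes the action, gesture recognition resolves what the user pointed at, and fusion joins them into one command.

```java
// Fusion of two unimodal inputs into one multimodal command.
class InputFusion {
    static String fuse(String spokenCommand, String pointedObject) {
        // The spoken part carries the verb, the gesture the referent.
        return spokenCommand + " " + pointedObject;
    }
}
```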

49

Output

(Diagram: Application 1 and Application 2 publish output events to the Output Bus; output handlers, e.g. Text 2 Speech, subscribe and render them.)

50

Context Awareness

(Diagram: Application 1 publishes the reminder "Take your ProzacTM!" to the Output Bus for rendering by the Text 2 Speech handler; the label "Privacy Awareness" marks that such private content must be presented with care.)
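Privacy-aware output routing can be sketched as a simple decision (hypothetical handler names and rule, invented for illustration): content marked as private is only sent to a broadcast channel such as a loudspeaker if the addressed user is alone.

```java
// Privacy-aware routing of an output event, using the contentPrivacyLevel
// property from slide 57 (the routing rule itself is an assumption).
class PrivacyAwareRouter {
    static String route(String contentPrivacyLevel, boolean userAlone) {
        if (contentPrivacyLevel.equals("private") && !userAlone)
            return "personal-display"; // fall back to a non-broadcast channel
        return "text-to-speech";       // loudspeaker output is acceptable
    }
}
```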

51

Context Awareness: Dynamic


52

Input Bus

(Class diagram: a shared base with «interface» Bus, AbstractBus, and BusStrategy; the «interface» InputBus is implemented by InputBusImpl together with an InputBusStrategy.)

53

Input Bus Members

InputPublisher (abstract):

#InputPublisher(BundleContext)
+publish(InputEvent)

InputSubscriber (abstract):

#InputSubscriber(BundleContext)
#addNewRegParams(String)
+dialogAborted(String)
+handleInputEvent(InputEvent)

shared base: «interface» BusMember, «interface» Publisher, «interface» Subscriber

54

Functional Model of Input Events

InputEvent:

#InputEvent(PResource, Location, String)
#InputEvent(PResource, Location, Submit)
+getDialogID() : String
+getInputLocation() : Location
+getInputSentence() : String
+getParentDialogURI() : String
+getSubmissionID() : String
+getSubmittedData() : PResource
+getUser() : PResource
+getUserInput(String[]) : Object
+hasDialogInput() : boolean
+isServiceSearch() : boolean
+isSubdialogCall() : boolean
+isSubdialogSubmission() : boolean

55

Output Bus

(Class diagram: the same shared base with «interface» Bus, AbstractBus, and BusStrategy; the «interface» OutputBus is implemented by OutputBusImpl together with an OutputBusStrategy, which uses the «interface» DialogManager.)

«interface» DialogManager:

+checkNewDialog(in oe : OutputEvent) : boolean
+dialogFinished(in dialogID : String)
+getSuspendedDialog(in dialogID : String) : OutputEvent
+suspendDialog(in dialogID : String)

56

Output Bus Members

OutputPublisher (abstract):

#OutputPublisher(BundleContext)
+abortDialog(String)
+adaptationParametersChanged(OutputEvent, String)
+dialogSuspended(String)
+publish(OutputEvent)
+resumeDialog(String, PResource)

OutputSubscriber (abstract):

#OutputSubscriber(BundleContext, OutputEventPattern)
+adaptationParametersChanged(String, String, Object)
#addNewRegParams(OutputEventPattern)
+cutDialog(String) : PResource
+dialogAborted(String)
+dialogFinished(Submit, boolean)
+handleOutputEvent(OutputEvent)
#removeMatchingRegParams(OutputEventPattern)

shared base: «interface» BusMember, «interface» Publisher, «interface» Subscriber

57

Output Event Properties & their Providers

addressedUser (app)

contentPrivacyLevel (app)

channelPrivacyLevel (DM)

dialogForm (app)

dialogPriority (app)

hasAccessImpairment (DM)

outputLanguage (app)

outputModality (DM)

altOutputModality (DM)

presentationLocation (DM)

Examples of modality-specific parameters (DM)

screenResolutionMaxX screenResolutionMaxY

screenResolutionMinX screenResolutionMinY

voiceGender voiceLevel

58

Modelling of Access Impairments

impairmentLevel: {none, low, middle, high, full}

AccessImpairment, with subclasses HearingImpairment, SightImpairment, PhysicalImpairment, and SpeakingImpairment

SightImpairment, in turn, with subclasses ColorBlindness, NearSightedness, FarSightedness, Astigmatism, and LightSensitivity
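The taxonomy maps directly onto a class hierarchy; a minimal sketch (class and field names follow the slide, the Java rendering itself is an assumption):

```java
// Levels an access impairment can take, as on slide 58.
enum ImpairmentLevel { NONE, LOW, MIDDLE, HIGH, FULL }

// Root of the impairment taxonomy: every impairment carries a level.
abstract class AccessImpairment {
    final ImpairmentLevel level;
    AccessImpairment(ImpairmentLevel level) { this.level = level; }
}

class HearingImpairment extends AccessImpairment {
    HearingImpairment(ImpairmentLevel l) { super(l); }
}

class SightImpairment extends AccessImpairment {
    SightImpairment(ImpairmentLevel l) { super(l); }
}

// One of the sight-impairment specializations from the slide.
class ColorBlindness extends SightImpairment {
    ColorBlindness(ImpairmentLevel l) { super(l); }
}
```

Modelling the levels explicitly lets the brokerage match, e.g., a full sight impairment against output handlers that are appropriate for it.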

59

Thank you for your attention

&

see you at the next lecture