Lund University was founded in 1666 and is repeatedly ranked among the world’s top universities. The University has around 47 000 students and more than 8 800 staff based in Lund, Helsingborg and Malmö. We are united in our efforts to understand, explain and improve our world and the human condition.
Lund University welcomes applicants with diverse backgrounds and experiences. We regard gender equality and diversity as a strength and an asset.
Researcher
Period: 2019-08-01 to 2020-02-29, or by agreement
Full-time
Researcher in Cognitive Science at the Choice Blindness lab
The Choice Blindness lab at Lund University Cognitive Science studies questions concerning the role of external feedback for cognitive processes ranging from preferences and attitudes to emotion and speech production. The focus of the current position would be to explore self-awareness of the emotionality expressed in the human voice. The position is financed by a grant from The Crafoord Foundation.
Background
The current project is part of a general research program concerning the limits of self-knowledge and introspective report. How much do we know about ourselves, and how do we come to acquire this knowledge? The specific focus of this project would be to explore self-awareness of the emotionality expressed in the human voice. This signal has evolved as a means to inform as well as to influence others - when I sound happy it makes you glad; when there is anger in my voice, what I say carries more weight. But how aware are we of the emotionality we express? And when we express an emotion, do we also influence ourselves by listening to our own voice?
To investigate this, we have adapted techniques from the field of machine voice synthesis to create a platform that can alter the emotional quality of the participants’ speech in real time (e.g. in the direction of happiness, sadness or fear), without introducing delays that break the natural conversational flow. These changes can also be applied at a level where they are clearly detectable by independent listeners, while most participants still remain unaware that the emotional tone of their voices has been modified.
This technique allows us to study emotional signalling and emotional self-awareness in ways that have never been possible before. For example, to what extent do we listen to the emotionality of our own voice when we describe a previous experience, and what effects may this have on our future recollection of this event? And in a social context, if two people engage in a negotiation or a competitive game like a prisoner’s dilemma, how would the interaction be influenced by the emotional signal of the participants’ voices? Such studies would reveal to what extent emotional self-monitoring is an active component in everyday decision making, as well as in many other aspects of everyday life.
For further details on work from the Choice Blindness lab, see:
https://www.lucs.lu.se/choice-blindness-group/
For further details on the voice manipulation platform known as DAVID, see:
http://cream.ircam.fr/?p=44
Subject area
Cognitive Science
Job assignment
To design and perform experiments within the context of emotional self-monitoring in voice, preferably using the DAVID platform.
Eligibility and assessment
The applicant should have a strong background in sound and voice processing, as well as documented experience of designing and performing psychological experiments. Prior experience using the DAVID platform would be considered a strong merit. The position is at the postdoc level.
To submit in the recruitment system when applying (in addition to general requirements by Lund University)
CV and a project plan of no more than 3 pages
| Type of employment | Temporary position |
|---|---|
| Contract type | Full time |
| First day of employment | 2019-08-01 |
| Salary | Monthly salary |
| Number of positions | 1 |
| Full-time equivalent | 100 % |
| City | Lund |
| County | Skåne län |
| Country | Sweden |
| Reference number | PA2019/2081 |
| Published | 11 June 2019 |
| Last application date | 24 June 2019, 11:59 PM CEST |