<div dir="ltr"><div dir="ltr"><div style="min-height:100%;color:rgb(32,33,36);font-family:"Google Sans",Roboto,RobotoDraft,Helvetica,Arial,sans-serif;font-size:medium"><div style="width:1512px"><div><div style="display:flex;background-color:transparent"><div style="overflow:hidden"><div style="border-radius:16px;margin-bottom:16px;overflow-y:hidden"><div><div><div><div id="m_-7648385903243039620gmail-:3" style="background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;overflow-y:scroll;padding-right:0px;height:612px"><div id="m_-7648385903243039620gmail-:1" style="padding:0px;vertical-align:bottom;min-height:422px"><div><div role="main"><div><div style="background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;min-height:64ex;min-width:502px;margin:0px;padding-right:16px"><div><div style="background-color:transparent;color:rgb(34,34,34);min-width:502px;padding:0px"><div role="list"><div role="listitem" aria-expanded="true" style="padding-bottom:0px;max-width:100000px;clear:both;outline:none"><div style="margin-bottom:0px;border-width:0px;border-top-style:solid;border-right-style:initial;border-bottom-style:initial;border-left-style:initial;border-top-color:rgb(239,239,239);border-right-color:initial;border-left-color:initial;border-bottom-color:initial;border-radius:0px;float:left;width:1168px"><div style="border-top:none;padding-top:0px;background-color:transparent;border-right:0px;border-bottom:0px rgba(100,121,143,0.12);border-left:0px;border-radius:0px;margin-bottom:0px;margin-left:0px;margin-right:0px"><div id="m_-7648385903243039620gmail-avWBGd-118"><div id="m_-7648385903243039620gmail-avWBGd-119"><div style="border-left:none;padding:0px;display:flex"><div style="margin:0px;min-width:0px;padding:0px 0px 20px;width:auto"><div><div id="m_-7648385903243039620gmail-:25b" 
style="direction:ltr;margin:8px 0px 0px;padding:0px;font-size:0.875rem;overflow-x:hidden"><div id="m_-7648385903243039620gmail-:25a" style="direction:ltr;font-variant-numeric:normal;font-variant-east-asian:normal;font-variant-alternates:normal;font-size-adjust:none;font-kerning:auto;font-feature-settings:normal;font-stretch:normal;font-size:small;line-height:1.5;font-family:Arial,Helvetica,sans-serif;overflow:auto hidden"><div id="m_-7648385903243039620gmail-avWBGd-135"><div dir="ltr"><div dir="ltr"><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-family:Arial,sans-serif;color:black">PhD in visual cognitive computational
neuroscience </span></b></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Supervisor: </span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Dr. Kamila Maria Jozwik, Jozwik lab, University of
Cambridge </span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Application deadline: </span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">13th March 2026</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Application link: </span></b><span style="color:black"><a href="https://www.postgraduate.study.cam.ac.uk/courses/directory/cvbspdbsc" style="color:blue" target="_blank"><span style="font-size:11pt;font-family:Arial,sans-serif">https://www.postgraduate.study.cam.ac.uk/courses/directory/cvbspdbsc</span></a></span><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"></span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">PhD fees status: </span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Home fees only (</span><span style="color:black"><a href="https://www.postgraduate.study.cam.ac.uk/finance/fees/what-my-fee-status" style="color:blue" target="_blank"><span 
style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(0,114,207)">https://www.postgraduate.study.cam.ac.uk/finance/fees/what-my-fee-status</span></a></span><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">), 4 years, fully funded</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span lang="EN-US" style="font-size:11pt;font-family:Arial,sans-serif;color:black">Start date</span></b><span lang="EN-US" style="font-size:11pt;font-family:Arial,sans-serif;color:black">: October 2026</span><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"></span></p><p class="MsoNormal" style="text-align:justify;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">The Jozwik lab studies visuo-semantic cognition combining cognitive
science, neuroscience, and computational modelling. The lab’s research has
focused on probing specific visual dimensions in the context of face, animacy,
and object representations more generally. We collect and analyse human
behavioural and brain imaging (fMRI and M/EEG) data. We also analyse macaque
electrophysiology data obtained through collaborations and perform
cross-species comparisons. We use machine learning techniques for neural data
analysis and computational modelling with a special interest in
biologically inspired deep learning and AI models (NeuroAI). The computational
models we work with include vision deep learning models (such as
topographical, recurrent, or developmentally inspired models), multimodal
vision and language models, and large language models. Prior work is available
on Google Scholar: </span><a href="https://scholar.google.com/citations?hl=en&user=oEifmSgAAAAJ&view_op=list_works&sortby=pubdate" style="color:blue" target="_blank"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(0,114,207)">https://scholar.google.com/citations?hl=en&user=oEifmSgAAAAJ&view_op=list_works&sortby=pubdate</span></a><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">. </span><span lang="EN-US" style="font-size:11pt;font-family:Arial,sans-serif;color:black">We have also begun exploring how to apply our expertise in
visuo-semantic cognition and AI to neurotechnology (Focused Ultrasound
Stimulation) and understanding mental health conditions. </span></p><p class="MsoNormal" style="text-align:justify;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span lang="EN-US" style="font-size:11pt;font-family:Arial,sans-serif;color:black"> </span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">The PhD student is welcome to work on one (or more)
of the three aspects of the research programme funded by the Royal Society that
aims to disentangle and model behaviourally-relevant visual and
semantic dimensions of visual cognition in the human brain (characteristics of
objects such as “curved”, “pink”, “having eyes”, “being animate”, “having
agency”, or ones that are hard to name), while increasing the ecological
validity of experiments (including mobile EEG and immersive technologies), in
light of the three aims below. Dr. Jozwik would be happy to discuss PhD projects related to
these aims, as there is some flexibility in research directions.</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">1) characterise behaviourally-relevant visual and
semantic dimensions using large-scale brain imaging datasets of
responses to images, model these representations with computational models,
and validate the resulting predictions in follow-up neuroimaging experiments,</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">2) define and model dimensions related to the
perception of animacy during interactions with objects and people, using videos
(behaviour, fMRI, and MEG),</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">3) determine to what extent these brain
representations and dimensions change when humans are immersed in the
environment (VR/AR and/or mobile EEG).</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">The ideal candidate</span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"> will have:</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- extensive experience in programming in Python or
MATLAB and data analysis (essential; note that coursework coding during
an undergraduate or Master’s degree will likely not be enough)</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- substantial research experience (essential, e.g.,
through a research MPhil/Master’s degree or a research assistant position)</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- experience with behavioural and neuroimaging
(fMRI, M/EEG) data design/collection/analysis</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- experience in machine learning and AI</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- a collaborative approach to doing science and
willingness to help other lab members</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- curiosity and motivation to work on the proposed
or related research questions.</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"> </span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Before applying, please contact Kamila Maria Jozwik</span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"> (Royal Society University Research Fellow and
Assistant Research Professor, </span><span style="color:black"><a href="mailto:jozwik.kamila@gmail.com" style="color:blue" target="_blank"><span style="font-size:11pt;font-family:Arial,sans-serif">jozwik.kamila@gmail.com</span></a></span><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"> or <a href="mailto:kj287@cam.ac.uk" target="_blank">kj287@cam.ac.uk</a>). </span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">In the initial email, please include: </span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- your CV</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- information about your programming, computational
modelling, and relevant research experience, including data collection and analysis
(fMRI, M/EEG, neuromodulation, electrophysiology, behaviour)</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- details of journal and conference publications,
preprints, and research theses</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">- Please also ask 2-3 of your referees, ideally
people with whom you have worked on research projects, to email their reference
letters to Dr. Jozwik.</span></p><p class="MsoNormal" style="text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)"> </span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;text-align:justify;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-size:12pt;font-family:Aptos,sans-serif"><b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">Lab research environment</span></b><span style="font-size:11pt;font-family:Arial,sans-serif;color:rgb(23,23,23)">: The Jozwik lab is based at the MRC Cognition and
Brain Sciences Unit, University of Cambridge, with links to the broader Cambridge
(e.g., Cambridge NeuroWorks, powered by the Advanced Research and Invention Agency)
and international scientific ecosystems (e.g., the Center for Brains, Minds
& Machines, now MIT Quest for Intelligence). The Unit has an on-site 3T
fMRI scanner (with access to a 7T fMRI scanner within cycling distance), an MEG
scanner, EEG systems, Focused Ultrasound, Transcranial Magnetic Stimulation,
and dedicated methods and computing support staff. The Unit runs two MPhil
programmes (Cognitive Neuroscience and NeuroAI), and PhD students have the
opportunity to supervise MPhil students. The lab values commitment to rigorous,
open science, supports diversity in all its forms, and fosters curiosity in a
supportive, multidisciplinary, and international research environment.</span></p><p class="MsoNormal" style="margin:0cm 0cm 7.5pt;background-image:initial;background-position:initial;background-size:initial;background-repeat:initial;background-origin:initial;background-clip:initial;font-family:Aptos,sans-serif">
</p><p class="MsoNormal" style="text-align:justify;margin:0cm;font-size:12pt;font-family:Aptos,sans-serif"><span style="font-family:Arial,sans-serif"> </span></p></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div></div>
</div>