
Hi — hopefully you all can hear me. Great. Welcome back to day two of our PhD Summit. I want to welcome back all of the Microsoft Research award recipients and also all of the CMD-IT FLIP fellows this morning. I didn't get to meet you all yesterday, but my name is Meredith Ringel Morris. I'm a Senior Principal Researcher here and the research area manager for Interaction, Accessibility, and Mixed Reality, and I'm also the chair of the Microsoft Research Dissertation Grant program. I'm excited to be with you all virtually this morning — it's really nice to meet you all.

I think we're going to move on to the next slide in the deck. Great. I wanted to give an overview of this morning's agenda. After this introduction, we'll move on to a presentation on big data research ethics and human rights by Mary Gray, who is a Senior Principal Researcher at Microsoft Research's New England lab, and whom you may have heard of recently in the news because she received the MacArthur "genius" grant for her research — so this should be a really exciting talk to look forward to. Then we'll have a short break: everyone can stand up, stretch, and get away from their Teams windows for a few minutes. After that there will be some time for one-on-one meetings with the researchers, and I believe we'll be using the Mingler app again for those. During the one-on-one meetings there's going to be a chance to connect not only with the researchers, and to network amongst yourselves and meet the other students, but also with some recruiters: Matthew and Vikas will be our recruiters, in case you have questions for them about internships, postdocs, or full-time positions with Microsoft generally or Microsoft Research specifically. After the networking in the Mingler app, we'll come back to Teams for a panel discussion about careers at Microsoft Research; Denae Ford Robinson, who is one of our more recent researchers and a past recipient of the Microsoft Research Fellowship, will be leading that panel conversation. And then we'll get back together at the end of the day to wrap up and address any final questions you all may have.

Let's move on to the next slide. This, again, is just a reminder about how to access the Mingler app that we'll be using for the breakout session, and a reminder that you can especially look for Matthew and Vikas if you have any questions about recruiting-related aspects. Next slide. I'm going to pause now to let you all see a video message from Kevin Scott, who is the company's Chief Technology Officer and also the Executive Vice President of Technology and Research, of which Microsoft Research is a component. He has a recorded message to share with you all, so let's take a look at that.

Hi, I'm Kevin Scott. Thank you all so much for taking the time to join me today. I'd like to share with you why I think Microsoft is such a special place to pursue your curiosity and passions. Microsoft's mission and culture, the breadth of the research and engineering that we do, and the many ways that we touch the lives of individuals and organizations every day allow and encourage individuals to grow professionally and to make a meaningful, positive impact on the world. You all are at a really meaningful inflection point in your lives and careers, and the decisions that you make now will influence so much in the future. Many years ago, I was right where you all are today.
As a student, I had the good fortune to intern at Microsoft Research in a group that was then called the Programmer Productivity Research Center. My graduate work was in compiler optimization and programming languages, and I vividly remember how special it felt to work alongside so many of the most talented individuals in my profession, doing work that had a chance to influence a technology revolution. Over my career I've spent a lot of time at different companies in the industry, and I can say that the culture and opportunities here at Microsoft right now are truly special. Today, my job at Microsoft is helping everyone do research and engineering in a way that allows Microsoft to best achieve its mission.

That mission is why I'm here: to empower every individual and organization on the planet to achieve more. In addition to serving as CTO for all of Microsoft, I run a division called Technology and Research, which includes Microsoft Research, incubations, and advanced development. Our challenge in Technology and Research is to help empower all of Microsoft to deploy the most advanced technology in the world to serve the needs of those who use our products and services. In some cases that means working directly with our engineering teams, building and shipping product; in some cases it means trying to see past the frontiers of what is currently possible, advancing the state of the art and of knowledge, and helping to bring new science and technology into existence. Just as one example of this work, from basic research to deployed infrastructure, Technology and Research is helping Microsoft expand what's possible with AI and machine learning. You can find our work in leading peer-reviewed publications and open-source contributions, in partnerships with leading AI organizations like OpenAI, and in Microsoft's AI products, including our work on AI supercomputing, large-scale model training, and advanced reinforcement learning. But AI is hardly the only thing we're working on. Our work spans from healthcare and synthetic biology, to databases, distributed systems, and numerical optimization, to ethics, AI safety, and climate change — and nearly everything in between — as we try to ensure that both Microsoft and the world have the technology we will need for a prosperous future. We really value and encourage deep curiosity, intellectual risk-taking, and the sorts of long-term thinking that can help us stay at the cutting edge and be out in front of the next big things surely to arrive in a rapidly evolving technology landscape. Our job is to help you connect your ability to create those technologies of the future to Microsoft products and services that impact billions of people around the globe — and, in doing so, to learn, build your career, and hopefully have a lot of fun. Thank you all so much for being here today, and I hope this has given you even a small bit of inspiration to consider the amazing things that are possible here at Microsoft. I look forward to having the chance to see the incredible and meaningful work that you all do in the future.

Thanks for that virtual message, Kevin. And now it's my pleasure to introduce a live presentation that you're very lucky to have — I'm excited to see it. Mary Gray, from the New England lab, is an anthropologist and media scholar who focuses on how technologies transform people's lives. Recently she co-authored the acclaimed book Ghost Work, which examined the ways the crowdsourcing and human computation industries impact the lives of the workers themselves, and she was recently recognized for her work with the MacArthur award. So thank you, Mary, for joining us today — please take it away.

Thanks so much, Merrie — it's really my pleasure. Hey everybody, it's really good to see you out there. This is of course not the typical summit, but hopefully we're going to get a chance for some discussion. What I'd like to share with you today is some work I'm doing thinking about what it is that we can do — as technologists, and with our theories — to start addressing social inequity, and to really make that core to our approach to what we build.
So let me share with you a deck that I'm going to use just to prompt the conversation, and hopefully we'll have some time for discussion after that. I've fussed a little with my title, and I want to suggest that when we're thinking about social equity, in many ways we're colliding with the different — sometimes conflicting — incentives of big tech as an industry, as consumable goods and products and services, and with questions of human rights that we're only beginning to see come to the fore as we build out technologies that really do integrate with people's everyday lives. I know that the PhD students we have at the summit come from a range of disciplines, so for some of you, what I'm about to say is not news. But what I want you to think about is which pieces of what I have to say you might bring into your conversations with your peers.

Because this is a very interdisciplinary project: addressing social inequities and bringing tech to the table — not to fix what's wrong, but to support us in our efforts to address injustice, systemic power asymmetries, and really systemic inequity. Back when I started looking at the internet, some of my earliest work was about understanding what difference the internet makes for young people in rural parts of the United States, particularly those coming out as LGBTQ — lesbian, gay, bi, trans, questioning, or queer — when they're in rural places where it's not easy to find support. The internet looked pretty much like this. It was around the late '90s — I'm dating myself here — but the internet was effectively treated as the equivalent of a phone book or a large index system; we hadn't really imagined what kind of social connections we'd be making with it. CS and engineering were certainly in the mix at this moment, and much of the research methodology out there for understanding the networked world we were building looked like mining data sets: cleaning and structuring somewhat messy existing data sets; scraping websites and coming up with really cool crawlers that could gather up data points for us to analyze; maybe buying new structured data and figuring out how to connect it with other pre-existing data sets; and using telemetry data — figuring out what we can learn from things like keystrokes and mouse movements. Well, since the 2000s, much of computer science and engineering — whether I want to look at health outcomes or understand the criminal justice system — is really interacting with society. We're often doing A/B testing in real time to see the impact of our prototypes. We might be connecting previously disconnected data sets: using the US census, connecting it to search logs, and cross-referencing it with, say, Twitter material — tweets that might tell us somebody's political leanings. And importantly — I feel like this is really the next level for artificial intelligence research — we're having people annotate their own data: asking them if they will tell us what was behind the decisions they were making. In fact, to get higher-quality data, we really need more social context; we need to understand what is behind the data points we're reading. So since the 2000s — really since those late 1900s, which is a scary way to put it — what's changed for CS and engineering? Well, I would argue that more and more of us, and of you, are asking social and behavioral questions that require experimentation. You're not just measuring what people are doing; you're prompting people, you're looking to perturb, you're looking to experiment and learn from human input. We are often creating new methods for the job, whether that's figuring out how to survey people at scale through mobile technologies or coming up with new methodologies for sensing through the Internet of Things to capture people's day-to-day experiences. And then, part of all the new methodologies we have is the availability of social data. We're literally studying human interaction in the nitty-gritty, looking at chat logs or search logs, as I mentioned, and looking at that at scale, trying to figure out the differences between groups and populations. We're taking that material to understand society.
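To make that kind of data linkage concrete, here is a minimal sketch — the regions, column names, and numbers are all invented for illustration — of how little code it takes to turn two previously disconnected collections into a new research instrument:

```python
# A minimal sketch of "connecting previously disconnected data sets":
# joining census-style statistics to aggregated search-log counts by region.
# All values and column names here are hypothetical.
import pandas as pd

# Hypothetical public statistics: one row per region.
census = pd.DataFrame({
    "region": ["A", "B", "C"],
    "median_income": [42000, 58000, 31000],
})

# Hypothetical platform data: aggregated query rates per region.
search_logs = pd.DataFrame({
    "region": ["A", "B", "C"],
    "health_query_rate": [0.12, 0.07, 0.21],
})

# One join creates a new research instrument -- and it inherits every
# provenance and consent question attached to each side of the merge.
linked = census.merge(search_logs, on="region")
print(linked)
```

The ease of that merge is exactly the point of the talk: the technical step is trivial, while the ethical questions it raises are not.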
There are disciplines that have been doing this for more than 100 years — disciplines like mine: anthropology, sociology, political science. Often those disciplines have methodologies that assume we're seeing people across a clipboard or across a table; we've never had the capacity to look at such large-scale human interaction before. So some of my world of anthropology is connecting with the other part of my world, which is you all: computer science and engineering.

But what's missing from our new approach of working with large-scale data? Arguably, CS and engineering — but also the other social sciences — haven't updated their ethics. And I don't mean this in terms of principles; I want to talk with you about our practices. When we think about what's most important for unlocking society's problems, we often think we've got to get the data — the more data the better — to understand society's problems and challenges and the questions of inequality. Arguably, CS and engineering have spent the past two decades learning how to secure data and keep it private. But if you translate that into the language of "protect and defend," you can see how, in many ways, our moves to understand data also keep us at a substantial distance. "The more data we can get, the better" — we get data-obsessed. I'd like to bring a feminist critique to this attitude of "the more the better" and "if I have it, it's mine to do with what I'd like." A feminist critique says: whenever we find ourselves possessive — protecting and defending data — we might ask ourselves, who am I locking out of my analyses? Who am I preventing from engaging me in understanding this material? And particularly when we hang our hats on de-identification — "as long as I de-identify the data, I can do anything I want with it" — we are ignoring the fundamental questions: do I have a right to go through this material? When I go through it, what am I losing? Who am I leaving behind when I'm not thinking about what we call data provenance — where the data came from and, most importantly, under what conditions? For me this is so important when I think about COVID data, because we know the majority of COVID data coming in right now is disproportionately coming from communities of color and low-income communities that have high health risks. So it seems so important not just to rush in to "protect and defend" that data and learn what we can because the moment is so critical, but equally to respect who produced that data: how do we show humility, and in many ways deference, to the needs of those communities as part of our research methodology? When you're studying people, the most important thing you're doing is holding yourself accountable to the people who generated the data; the data collection is really secondary. Why? Because you want people to feel they can trust you with the information they share, so that the next research scientists — your graduate students, your colleagues down the road — will always be able to ask those really poignant, sometimes intimate, sometimes quite private questions that people would otherwise wall us off from if they feel we're going to be intrusive about how we learn about their lives. We have to recognize that diverse groups have distinct risks when we're looking at their information, and ideally we'd harmonize the scientific questions we're asking with what people need. That's not to say that the cool questions we might come up with aren't uniquely impactful or important, particularly for science. But when we disconnect them from what people actually need from science, we are again missing the opportunity to address the kinds of social inequities that build over time when we're not tending to and tracking the critical questions that society needs every science — not just computer science and engineering — to address.
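To see why resting everything on de-identification settles so little, here is a minimal sketch — with entirely invented data — of how quasi-identifiers left in a "de-identified" release can be joined back to a public record:

```python
# A minimal sketch of the limits of de-identification: dropping direct
# identifiers (names) leaves quasi-identifiers (zip code, birth year)
# that can be joined to a public registry. All data here are invented.
import pandas as pd

released = pd.DataFrame({              # a "de-identified" research release
    "zip": ["02139", "02139", "98052"],
    "birth_year": [1971, 1984, 1990],
    "diagnosis": ["asthma", "diabetes", "flu"],
})

public_roll = pd.DataFrame({           # a hypothetical public registry
    "name": ["j. doe", "m. roe"],
    "zip": ["02139", "98052"],
    "birth_year": [1984, 1990],
})

# The join re-attaches names the release was supposed to have removed.
reidentified = released.merge(public_roll, on=["zip", "birth_year"])
print(reidentified[["name", "diagnosis"]])
```

Which is the point being made here: de-identification answers a narrow technical question, not the question of whether we had a right to go through the material in the first place.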
I'm making a case — and this is in some other writing I'm doing — that our experience of techlash is precisely because the general public is tired of the way we approach our use of data. They don't quite know what it is that bothers them, but they're quite clear that they're bothered by the sheer amount of reach that we have — as technologists, but also as tech researchers — when we're trying to build out systems and learn from all the materials they've generated online.

So I have here a very brief, arguably spotty list — I could point to a new example of what I call a social fail pretty much every single day of the week — that tracks the impact and equity issues that big tech, and by proxy CS and engineering as disciplines and all of us who feed them, continue to generate over time. Every one of these examples of a social fail, I want to offer to you, has a lesson to teach us about how to move forward. But what do they have in common? In every case we see the use of data that was scraped, kept, and de-identified, but without clear data provenance: we don't actually know whether somebody wanted us to be able to repurpose that material so that we could build, say, a classifier for a feature space like facial recognition. The current public conversation around facial recognition is a really good one. There are questions we could have asked before we ever built that kind of feature. Whether or not they would have shut down the creation of that feature, we don't know — but there are questions we could have asked that would have tuned us in to the fault lines we were treading on, fault lines we are otherwise somewhat oblivious to when we build before we ask who we are building for and why we are building what we're building. Often, when we are just building for building's sake, our stakeholders are ourselves (which is not a crime), our institutions (usually our universities), and whoever we imagine might be able to access the data. So when we talk about de-identified data, we often think of sharing it with colleagues who are going to build something else, or perhaps other tech companies who are going to build further products and services. Again, there's nothing wrong with that in itself — but note who we're leaving out as a stakeholder who could engage us and call out where we're overreaching or missing the mark for what they need. The second thing these social fails have in common is a data-centric mindset: a grabbiness, and a lack of care for the circumstances under which the data were accessed. And then, arguably — and this is so poignant today, as we have conversations about the homogeneity of our data sets, how often our data sets are missing entire groups of people and are also disparaging to particular groups of people — there is no training currently, across the board, that would help us recognize the groups of people in society who are key stakeholders and who should be at the table when we're developing these technologies. I'm going to give a quick shout-out to Merrie Morris's group, because she has been a leader in this space when it comes to developing accessible technologies: thinking about the role of abilities and disabilities, and about communities who can inform us how to build what we build, shows the kind of partnership and co-design that's possible. But that means researchers have to be thinking not just with their gut, but with a very explicit way of imagining: how do I experiment with those stakeholders in the room, in very explicit ways?
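One way to make "under what conditions did this data come to us" a first-class question, rather than an afterthought, is to carry provenance alongside the data itself. Here is a minimal sketch of that idea — the fields and the example record are hypothetical, not a real system:

```python
# A minimal sketch of carrying data provenance with the data, so that
# "can we repurpose this?" is a recorded fact rather than a guess.
# Field names and the example record are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class ProvenanceRecord:
    source: str                    # where the data came from
    collected_under: str           # the conditions of collection
    consented_uses: set[str] = field(default_factory=set)

    def permits(self, proposed_use: str) -> bool:
        """Check a proposed reuse against the recorded consent terms."""
        return proposed_use in self.consented_uses

faces = ProvenanceRecord(
    source="photo-sharing site X (hypothetical)",
    collected_under="terms of service; no research consent obtained",
    consented_uses={"personal sharing"},
)

# The question to ask *before* building the classifier:
print(faces.permits("train facial recognition model"))  # False
```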
So I want to offer, just quickly, a case study for comparison, to say we've actually been in this place of social fails before. It has to do with disciplines reaching — trying to test out new methodologies — and, again, not having ethical practices catch up with those methodologies in the search for truth. I love giving this example and I'm going to breeze through it quickly; I'm happy to talk about it later. It comes from psychology in the 1960s and '70s. As the world was becoming far more intercultural, the basic, fundamental psychological and physiological question of what difference it makes if I'm standing next to someone was such an important question to ask, because we quite literally were meeting different societies that had different norms and practices around how people physically held themselves, introduced themselves, or sat next to each other. So it's a really interesting, really fundamental question: how do we affect each other in space and time? I love this question because it intrinsically has so much value — to science, to understanding humans and humanity and society. It's not entirely clear what the social application is, but you can imagine.

But here's the methodology they came up with for studying this really important, foundational question: send research assistants into public bathrooms and have them stand next to men who are urinating. What I find fascinating about this case, as a comparison to the social fails of technology today, is that it was driven by an amazing new technology: a periscopic lens on a camera that could take rapid-fire photos of urine streams, to capture the timing and get a sense of the physiological constriction of the body when somebody is standing next to you. A really "great" use of tech. And if you compare this to the social fails of big tech that I zipped through, what they have in common with the Middlemist study is a novel method introducing new ethical challenges — challenges that are not obvious when you're using those methodologies, but that become painfully obvious when society responds to your research. Science is always posing risks to people; that's what experimentation is about. We can't guarantee that we're not going to harm people; we can only try, with a lot of intention, to think through what risks we are posing. And often the risks today are not risks to the individual who created a data point; they're risks to the groups that the individual models — that the person is a stand-in, a proxy, for — and to their social connections. We can't always know whether that one person is harmed, but the greater risk is society no longer wanting to engage with science. When we use people for research without asking or explaining what we're doing, we're undermining the public's trust in us, and that is an incredibly important insight to apply to the future of what we do. So there are some ethical principles in place — some of you might be familiar with them — that really are about the equitable impact of research practices. These tenets are called the Belmont principles, or the Common Rule. They're what you have to know if you're an anthropologist like me and you want to do any interviewing or participant observation — ethnographic research. You have to show how your research methodology is going to demonstrate respect: that almost always means, what's your consent process? What's meaningful about it? Are you communicating what you're doing, and should you be asking people on an ongoing basis, "are you okay with participating in my research?" Every piece of research that involves people requires you to think: who am I leaving out? How am I caring for someone who might be made situationally more vulnerable by the research I do? For example, if I'm studying young people coming out in rural Appalachia, I'd better be thinking about how to include young people who are out to their parents and those who are still closeted; how to include people who identify as trans or non-binary; how to reach people who are otherwise a bit out of reach — and do it in a way that respects, but also recognizes, the potential vulnerability I'm introducing with the question I'm asking. And then beneficence, actually one of my favorites, is to think through: who are the stakeholders? Who else needs to benefit from the work I'm doing? Have I thought through who could be put at risk, and what am I doing to mitigate those risks to the myriad stakeholders out there? Because it's rarely, if ever, one stakeholder who's in the picture. But I'd argue there's actually something missing.
I want to make a case for mutuality. Shout-out to my colleague Sasha Costanza-Chock, who wrote a beautiful book called Design Justice — if you haven't read it, I highly recommend it. It's an opportunity for us to think, as researchers: how do we broaden who we count as a domain expert? Who needs to be at the table? Who needs to inform the direction of our research — not as mere informants, but quite literally as collaborators, as those missing subject-matter experts who can tell us which direction to take our research design? And I think the ACM is actually moving us in this direction anyway.

They did a revision of their code of conduct for computing professionals in 2018, and it specifically pushes all of us to treat all people as our stakeholders when it comes to computing — because computing is ubiquitous. It's no longer a nerdy thing that a few nerds do; it's literally in everybody's lives, in ways they no longer see because it has become such a part of our background. It's also on us to be thinking about the possible risks — and I assure you, I promise you, thinking about those risks does not mean "don't do research." It means be principled, and be meticulous and rigorous about the questions you ask to think through risks. And lastly, thinking about the public good is always going to be part of your project. My good friend Virginia Eubanks, who wrote another beautiful book called Automating Inequality, has a fantastic way of putting this: if you're not working toward equity — if you think your work is just neutral — realize that everything in our world is built with these hierarchies and asymmetries of opportunity in place, and if you're not actively doing something to counter that, you're perpetuating what's there. So if inequity is simply part of our lives, you have to act with real consciousness to push against that inequity and move toward greater fairness and equality in the world. Sorry — my slide is stuck. Microsoft has actually registered federally to hold ourselves to these research principles and ethical practices, and I'm really proud to say that we do that: we're the only industry research setting that has an ethics review program and an institutional review board. That means you can keep publishing when you're here at Microsoft Research, if you're publishing in venues that require an ethics review. I'm incredibly proud of that, and I hope it's something our peers will adopt — many of them have asked us about our program, so I'm hopeful we'll all start holding ourselves to that standard. Sorry, my slide is freezing again — my last slide. The takeaways for today, which I hope you'll share with your colleagues: think about how much our work requires an approach that engages with people. If we want to innovate, we're going to need higher-quality training data — if you're into ML, that translates into needing to understand why people made the decisions they made, contextualizing the models you're building. We have to get socially relevant data, and that means we need new research practices — I would argue, not new ethics. Again, Merrie Morris sets the bar for this new path, for those new research practices. And anytime we're innovating, we're going to rely on people. Biomedicine learned this a long time ago: the basic federal ethics guidelines that I mapped out for you — respect, beneficence, justice — are non-negotiable. You can't do work that engages people and people's data in other disciplines without holding yourself to a deep analysis of those principles and practices before you start your work. So what would it look like if we held ourselves to the same rules — if data science, CS, and engineering had to play by the same rules as biomedicine? Could it help prevent, or at least lessen, the social fails in our industry and our disciplines? Because honestly, science's compact with society is minimized risk. We can't take an oath that we will prevent harm; we can commit to working to reduce risk and maximize benefits. So it's reckoning with everything we do as risky.
Sometimes it's low risk — but none of us are in a position to say, or to assume, what's risky for the people who are contributing data. We have to empirically check that. So I hope we become data-driven about that, and bring communities with us into this research process, so that we have much more mutuality in how we do our work. And I think with that — yes, that's my last slide. I had a shameless plug for the book; my co-author, Siddharth Suri, is also at Microsoft Research. And I'm trying to get back to — there we go, got it.

I hope I left us enough time for a little conversation. I see we have a couple of questions — and please feel free to keep bringing the questions, folks; I'm looking at them as they come into my feed. One question: "Playing devil's advocate — the problem with securing data, to me, is transparency. People hide problematic behavior behind the wall of privacy. I feel that having open access to data would reveal information that would help in dealing with problems like the pay gap, and in creating a health informatics system with better information. How is getting rid of privacy a bad thing, assuming we can convince the population it is to their benefit in the ways I described?" Let me lead with your parenthetical: convincing populations that it's to their benefit to let us look at their data, without them knowing it and without them having a say, is a big ask — particularly if you look at the history of research ethics. In that Middlemist example, there was public outcry that psychologists had the hubris to think it was okay to prioritize their scientific question without considering how diminishing it might be to human dignity to basically watch other people stand and pee. And the more vehemently we as scientists say "but it's for your own good," the harder it is for us to see that this is utilitarian logic: when you apply that logic, you're assuming you have the right to make that decision for someone else. So the devil's-advocate question is super important. The problem is really: what paradigm do we want to use for the ethics of handling people's data? If we use that utilitarian logic, we're locking ourselves into always deciding — really, holding the power to decide — when people should share data with us; and you could say it's hard to even call it sharing. If something on your desktop were incredibly important for me to know, and it could contribute to science, would you ever think it was appropriate for me to go into your desk drawer and take it? I think most of us would say no. But ironically, methodologically, if I asked you — if I told you why I wanted it — there's a pretty good chance you'd give it to me. And if you didn't, you'd be able to explain to me what feels risky about sharing that information. So again, it's really about tracking the objective: to be crass about it, the most effective tactic is asking up front. And I should say, I'm not asking us to get rid of privacy; I'm mostly saying, rethink how we approach that question. I hope that addresses it. A new question: "People's behaviors don't change until the incentive systems change, at a group or individual level. Right now the research community is very white, very Asian, very male. If a paper studies only Black people, the community sees it as being only about Black people; if a paper studies only white people, it is seen as a normal paper. How do we change the incentive systems of the entire research community to encourage — and really force — people to rethink their own biases in their research, rather than relying on them to do what's best out of the goodness of their hearts?" Wow, do I feel that question. I'm going to reply with a bit of a story. My earliest work was about sexuality and gender, and it specifically looked at LGBT people. For part of my career, most folks thought it was just about LGBT people and young people; no one really saw it as a case of what it means to be part of a community that prioritizes being familiar to each other.
So you know, there is a challenge when you're studying a group that is distinct — and I bet Merrie Morris could chime in on this.

When you're studying a group that's distinct, you as a researcher probably know you can translate it — say, "this is a case of a broader question" — but that translation work often falls to you, and this example is perfect. When you're "just studying normal people," that's called an unmarked category: when you're "just studying people," implicitly it's a matter of how ubiquitous whiteness is as a default. It's very hard to have enough of us calling out the assumption that "the average user" erases how distinctively we use technologies. Let me draw this out to a larger problem in technology studies: we tend to get focused on the technology, in that language of "end user" that has been beautifully critiqued in HCI. Whenever we're talking about technology use, we should all be asking: use by whom, in what context, and how? Those three questions will always matter; they shape what technologies mean to people. And that means there is no typical user — there is no generic abstraction of a person using technologies; people are always using them from the place they sit in. So I don't think we have to wait for people to, out of the goodness of their hearts, more broadly address how distinctive groups are. My hope is that, as a research methodology, it is no longer acceptable to say you've studied Twitter and "Twitter users do X." That should be nonsensical to all of us — it's empirically unsound, because we know groups of people use Twitter in different ways (and I'm just picking on Twitter as an example). That's really important when we're talking about training data, because there is no training data that doesn't have to be cracked open and thought of as particular groups of people. So who's missing in those groups? What does a "representative sample" mean? It means nothing outside of your research question: what a representative sample means for my research, if I'm studying young people in rural southeast Appalachia, is defined by that research question. In saying all this, I'm really pushing on our assumptions about a kind of generic user, which is challenging in CS and engineering because so much of design hinges on thinking "well, there are some assumptions I can make." I'll leave those thoughts there — I think we, hopefully, increasingly see this now.
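One concrete habit that follows from refusing the "generic user" is to report results disaggregated by group rather than as a single aggregate. A minimal sketch — the groups, labels, and predictions below are invented for illustration:

```python
# A minimal sketch of disaggregated evaluation: one aggregate number can
# hide a gap between groups that the per-group view exposes.
# All data here are invented.
import pandas as pd

results = pd.DataFrame({
    "group":     ["a", "a", "a", "b", "b", "b"],
    "label":     [1, 0, 1, 1, 0, 0],
    "predicted": [1, 0, 1, 0, 1, 0],
})

overall = (results.label == results.predicted).mean()
by_group = (
    results.assign(correct=results.label == results.predicted)
           .groupby("group")["correct"].mean()
)
print(f"overall accuracy: {overall:.2f}")  # 0.67 -- looks tolerable...
print(by_group)                            # group a: 1.00, group b: 0.33
```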
So, a new question here — and please, somebody raise a hand on my screen, maybe Ryan, to tell me if I'm almost out of time. "I think public mistrust of private companies is completely understandable considering how they have dealt with climate change" — love that you're bringing this up, questioner — "Big tech fails certainly added fuel to an existing fire. How do we go about repairing the trust between individuals and large societal structures such as companies and government? How can I trust an entity that is ultimately profit-driven?" Bingo. This is the heart of the argument. Again — and I'm speaking particularly to you as PhD students, our future — if we make it, as biomedicine did, impossible to contribute to a profit margin without also attending to the needs of society, so that you have to hit some basics, then we will be course-correcting, slowly but surely, this large ship of tech that has really never been called to the table and told: you actually have to be socially accountable if you are in people's day-to-day lives so thickly. Your point about climate change is a great one, because that's an example of how the externalities that come with building tech were never brought back to tech companies — they were never held accountable — until now. And now tech companies are all rushing to be the greenest tech company. So we don't want to wait for popular opinion. You all can seed change and apply pressure at every company you work for or contribute knowledge to as researchers. That's the thing to take away: through the work you do, you'll have an opportunity to push every company — "you need to show me foundational consent before I'll work with you."

Make that something that is just non-negotiable for you, your peers, and for our companies. I think with that I need to stop so you all get a break — but it's been such a pleasure to be here.

Thank you so much, Mary — oh, am I on? Thank you so much, Mary; that was really interesting, and I have to admit I was not familiar with the public urination study, so I definitely learned a new cocktail-party factoid today that I look forward to using in the future. So thanks again. For our fellowship and grant winners, and again for the FLIP fellows as well, I hope you enjoyed that as much as I did. Next we have a little stretch break — a get-away-from-your-computer break — and then at 10:30 many of you have one-on-one meetings scheduled with researchers. Check your calendars: you should have a separate Teams call link that you will need to use to join those one-on-one meetings. Then, beginning at 11:00, we'll have our networking break on Mingler, and again our recruiters will also be there. If your one-on-one meetings finish early and you want to get on Mingler early to network with each other, that is also fine. Remember, the URL for joining Mingler is here on the slide, and you'll sign in with either the Google, Facebook, or email option. After the Mingler session, we'll meet back here at noon on this Teams call to hear the panel discussion. Thank you so much, enjoy your one-on-one meetings, and we'll see you on Mingler in a little bit.
