Live captioning by Ai-Media.

SHANTHI ROBERTSON: After a false start, I think we are now broadcasting to our audience. Welcome, everybody, once again. Hello and welcome to the second webinar in our webinar series, which is brought to you by the ADEPT project, the Autonomy, Disability and Diverse Technology project. This is a collaborative Australian Research Council Linkage project. It is run at the Institute for Culture and Society at Western Sydney University, in partnership with a number of disability and migration service providers, arts organisations and technology partners. My name is Shanthi Robertson and I am a chief investigator on the project, but my main role tonight is to be your host for this webinar.

Before we begin, I would like to acknowledge that our university resides on the lands of the Darug people, and I am calling in from their traditional land. They have been owners of this country for thousands of years, and many of you will also be participating from the unceded lands of traditional owners. The project team wishes to acknowledge and pay our respects to elders past, present and emerging of these lands, always, and especially this week, which is NAIDOC Week.

The aim of our webinar series is to bring together experts from the project's core research team, but also our wider networks, to discuss some of the critical issues we are all collectively grappling with in our work. The theme for this evening's webinar is crisis, technology and disability service delivery. I will introduce our expert panel now.
Our first panellist is Lida Ghahremanlou. She is an analytics cloud solution architect and AI architect at Microsoft UK, and she is joining us very early in the morning. Snow Li is our next panellist. She is a psychosocial project lead. And our final expert is Liam Magee, Associate Professor.

To let you know how this will run, we will start off with a panel discussion, where I will ask a few key questions of our panel. If you have questions, we would welcome your contributions. Please type them into the Q&A function, and in the second part of the session I will call out some of these audience questions and try to cover as many as we can for our experts to discuss. We will also have a couple of polls popping up for our audience. One will pop up in a couple of minutes and one towards the end of the session. It would be great if our audience members could contribute their thoughts there.

Let's get started with our discussion. I am going to start by asking each of you a little bit more about the specific work you are doing in relation to technology and service delivery in your different organisations. I might start with Snow. I know your site has had to make a number of changes to how its services are delivered to the public. Can you start off by telling us a little more about your organisation, the communities you work with, and what your response has been to some of the issues that have cropped up this year?

SNOW LI: We have been around for more than 30 years.
We have always been a key organisation providing support to people with disability, people who are aged, and people who are experiencing mental illness. Being a service provider in the northern Sydney region for that long a time, we have supported lots of families through their life stages. This year is quite unique for us, as it is for everyone else. We started with the bushfires, quickly followed by COVID. As an organisation, we have always planned our technology implementation for the organisation. We had a roadmap for what we would implement in technology to help people better access care. What happened this year is that we had to fast-track things, and it really changed the assumptions we used to have. In the care sector, there has always been this assumption that people with disability and people who are aged are not tech savvy. That has been the discussion every time people bring it up; there is a strong backlash that we are not catering for everybody if we start to engage with technological solutions. With COVID, moving everything to telehealth fast-tracked how quickly people got into technology, and those assumptions about technology turned out to be plainly wrong compared to what we assumed at the beginning. Having said that, we do see a different adaptation process across our clients. For us, it was a slow transition process within the organisation, confronting some of the assumptions that we had, but also helping clients individually to get onto technology.
SHANTHI ROBERTSON: Lida, can you tell us more about the work Microsoft is doing with its AI for Accessibility program and other services for disability? Particularly if you have any feedback from the field about how effective some of these have been during COVID?

LIDA GHAHREMANLOU: Most people know Microsoft as traditionally a software company, but since 2014 Microsoft has become one of the bigger cloud providers. Its role has changed significantly, and that change runs through the culture of the company as well. In terms of AI and automation systems, we have a movement in our culture to provide to end users in ways that make AI accessible for everyone. The culture of accessibility has been at Microsoft since around the late 90s. We embedded that culture into the hiring process, and in recent years there have been initiative programs like AI for Good. In that cultural shift in the company, they have moved into how software, culture and products can change towards making AI accessible for everyone, so that it doesn't exclude any particular minority and tries to include all people.
As with many big tech companies, once COVID started, they responded immediately through software technology, providing cloud access for different customers as well as for minority groups who may not otherwise be able to use, for example, Teams, which has been enhanced for people with particular needs. In terms of vaccine development, Microsoft was one of the earliest sponsors to make datasets about COVID available through its open datasets. Many people used those datasets to start ML competitions to answer the questions researchers were asking in developing the vaccine. There has been major help through the technology for companies that are developing their own vaccines, and there are a number of them that are quite close to a result. I guess that is the way Microsoft got involved. There have also been special requests and considerations for employees with particular needs, who have special circumstances; through management and through different advisors, they make sure that every employee can work at home and has an easy environment to work from at home.

SHANTHI ROBERTSON: One of the fantastic things that is coming through so far is that we have panellists from really diverse organisations, ranging from service delivery to big tech.
The next thing I wanted to ask is about an issue which I think is probably common across all organisations in this space, regardless of size: the issue of trust. Snow, perhaps you can start us off again.

SNOW LI: We are moving a large cohort of people, so people who are aged, people who have disabilities and people from different cultural backgrounds, and we realised everyone has a different perspective. It depends on their age, cultural background, disability and capacity when it comes to trust, sharing information online and communicating with everyone online. In a previous workshop I touched on how different cultures perceive trust and privacy differently. Some believe sharing a private document with their peer group is completely fine. When there is distrust of the government, it becomes more problematic, so when we are collecting data from people online, one question they often ask is: how are we going to use the data? Will it affect my children's future in 20 years? It involves a lot of distrust and depends on the social and demographic profile, so for us it is about taking a very individualised approach when talking about trust, sharing information online and communicating online. The older generation prefer a phone call and more traditional means; when it comes to email, sometimes we will get a backlash: why do you have my email? It is a very interesting topic and something we have come across, and we have decided to take a very (inaudible) approach to technology at that particular intersection.
Having said that, COVID definitely changed the landscape and people's approach to technology, because like it or not you have to rely on technology, and that includes people with disability relying on telehealth for very private sessions. Some sessions involve behaviour support, where you are seeing a person and they are on the screen; how do you build trust from the screen? We quickly had to learn from that and tell our therapists: ask the person with disability about their likes and dislikes, and build trust through other means. You can have a background of superheroes if the children with disability are passionate about superheroes, so we have to use different ways to build trust before the service delivery session. This period will help us come up with creative solutions for tackling trust and privacy issues in this space.

LIDA GHAHREMANLOU: Microsoft is built on trust, especially because of the role we have as a big tech company in society as a cloud provider. It is very important, and we have this message, a slogan through the company, that we are built on trust. With the technology of AI, the breakthroughs over the last few years have been significant: speech to text, cognitive services and other breakthroughs that have come through to help automation and AI. The question has been asked: to what extent, and what about the transparency and privacy?
There have been stories, which I am sure the audience and yourself know of, where some of this privacy has been broken down through the services of other major big tech companies, because there has not been regulation on how to use that technology in ways that do not invade the privacy of individuals. Because of that, alongside AI for Accessibility and all the AI breakthroughs that tech companies have set their goals for, we have Responsible AI, which is based on six major principles, including fairness, reliability and safety, accountability, privacy and transparency. In each of these principles we are looking into a specific aspect of trust: how our product can be fair and not carry bias, and even if the data has bias, how the product and software is able to identify that bias through the data. Through the transparency pillar we have particular open-source software that helps scientists identify and understand the results of an ML model, an AI model, and again identify whether there is any bias, or whether there is something we are not understanding and what that is. It is also about looking into accountability and security. We are looking at all of that.
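Lida does not name a specific tool here, but the kind of bias check she describes, identifying bias "through the data", can be illustrated with a minimal sketch. The example below computes per-group selection rates for a model's decisions and their gap (often called the demographic parity difference in fairness toolkits); all names and numbers are hypothetical, not Microsoft's actual software.

```python
from collections import defaultdict

def selection_rates(predictions, groups):
    """Per-group rate of positive (favourable) decisions."""
    totals, positives = defaultdict(int), defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += int(pred)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_difference(predictions, groups):
    """Gap between the highest and lowest group selection rates.
    0.0 means every group receives favourable decisions at the same rate."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Hypothetical model decisions (1 = approved) across two demographic groups.
preds = [1, 1, 0, 1, 0, 0, 0, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(selection_rates(preds, groups))                 # {'A': 0.75, 'B': 0.25}
print(demographic_parity_difference(preds, groups))   # 0.5
```

A large gap flags the model for closer inspection even when the bias originated in the training data rather than the code itself, which is the scenario Lida raises.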
Responsible AI is not a responsibility for individuals alone; it sits across people, government and societies, so it is important that this concept is shared not only by technology companies and individuals but also by government. Through this practice, we also have a committee that makes sure the regulation comes through in the software, and that any new product we make available to the public follows all these principles. I guess this is the way the company is trying to stay on top of the game and be as transparent as it can be.

SHANTHI ROBERTSON: Responsible AI is a whole-of-society approach that we are looking towards at this moment. Liam, I might throw to you on this idea of trust. You might have a macro perspective on how trust might be working across different industries.

LIAM MAGEE: There has been an interesting transition in relation to how trust is, in a sense, fixed on particular organisations. I would have said a decade or more ago there was greater concern around public institutions, specifically governments, and their use of the data of citizens, for instance, and it is only quite recently, with major scandals where trust has been breached, or the perceived bond of trust between company and consumer has been breached, that this has shifted. That, I think, marks one transition. I think this year there is, in a sense, another kind of change underway. Trust has often been spoken about in relation to a company's market or consumers.
Technologies may not have the kinds of remedies to particular crises. I think that has probably been talked about a lot in environmental circles for a number of years, but in many other kinds of crises that are emerging now, it is not always the case that science and technology, whether through companies or public institutions, can come in and solve problems effectively. I think that particular shift is one that is quite challenging to process, both from within companies themselves and within the broader public, because it suggests there is a limit to the degree of control that socially we can exercise over the emergence of crises. Of course COVID is the most prominent at the moment, but I think the same sort of situation around trust relates to how environmental crises, like the bushfires I mentioned earlier, are dealt with. A final point: I think it is also important to note, as Lida has done, the extent to which trust has become a commodity for many organisations, in that not only do they pride themselves on a trusted relationship with customers, a concept that has been around a long time, but trust becomes almost a kind of property (inaudible). To refer to one incident that is evidence of this: immediately after the Cambridge Analytica scandal, there was a moment where Tim Cook, the CEO of Apple, criticised Facebook and contrasted the way Apple dealt with privacy and security for its customers compared with Facebook. So trust clearly becomes itself a commodity to differentiate one company from another.

SHANTHI ROBERTSON: The corporatisation of trust.
If we think about trust, one dimension is the really authentic inclusion of technology users in the design and testing of tech products, and for the work we are doing this includes users with disability. You all have different opinions and experiences on this, so I might explore that idea a little more: the idea of co-design. Snow, perhaps starting with you, because you have been involved in a start-up venture that has co-design built into it. What do you understand by co-design, and what can it do to address some issues around technology?

SNOW LI: In response to the crisis, we have seen a lot of players enter the disability sector space, being the start-up companies, and they are able to adapt to the environment quite well because they are agile with respect to products and product design. But it is a rapidly moving process, and it heavily involves them validating their idea, prototyping the idea and testing with the users. We always talk about co-design in services, but I guess for a company it is because they want to build a user-friendly product. The co-design process is well known, so I appreciate this process; it can lead to a more personalised solution for people with different needs, and we are now seeing more of these players in the disability space. At the same time, we are working with some of the disability start-ups, and being involved, I can quickly see how it can also be a problem if it is not trained for and done well. One example: one of the software engineers was very curious about designing a solution, so passionate about the space, and wanted to go out and do some user testing himself.
Without talking to me, he went into the field and got shut down straightaway. Basically, in the disability sector we have come a long way. We have had the process, the evolution, the words we use; we have an understanding of what the service is about and what it means, and that is something that has happened in the last 10 years. We continuously evolve the words we use in the service context because we know we want to be more person-centred and remove the stigma from the way we speak. But when the software engineer went out and did the user testing himself, he was using the wrong words. He was not from the sector. This comes back to the trust issues: trust is very much valued in the sector. Once people interact with you and find out you understand the sector and have been in the sector long enough, the trust is built instantly; for him, using the wrong words when talking about technology shut people down. So when we are talking about technology and co-design, it is a methodology, but at the same time you need to understand the sector and how to engage to get the most out of it, otherwise you will not get anything. I am glad you brought the co-design part up. This is a very interesting area, and I see that in the future we need to meet somewhere in the middle; there need to be some sort of protocols or ways that can help the technology players better engage with people with disability with a range of different conditions.

SHANTHI ROBERTSON: I think that is a critical point, and hopefully, as a community of practice, we can build some of those connections between tech users and the service sector as well.
Liam, I might throw to you to talk about co-design in your context and what that means for you in your role.

LIAM MAGEE: I think, from a research perspective, co-design is a powerful instrument, precisely because it serves to break down a lot of barriers that often exist between researchers and participants. In doing so, I think it can help shift certain relations of power. If you think about traditional research, the researcher frames the research and asks questions of the participants, the participants respond with answers, and the researcher takes those answers away, analyses them and comes up with various types of conclusions. That general model does not disappear entirely, but the processes by which participants can actively get involved in research are, in an analogous way to technology, an important means of destabilising existing structures of power. As you have alluded to, I think co-design has been very effective in the research context in dealing with products that aim to incorporate users, whether into a game or another type of technology. I do want to note that co-design also raises problems of its own. There are questions around who gets invited to participate in co-design workshops, and how accessible those sorts of events might be. Particularly in the contexts I have worked in, co-design decisions can often be overridden in later phases of development.
Those overrides can be difficult to circumvent, because often the reasons for overriding are to do with costs and time blowouts and so on. Co-design has its own costs: it can be an expensive process, or it can come up with particular ideas that are difficult to realise. Finally, I think in the context that interests us in our project, around automation and artificial intelligence, co-design is a really difficult concept to know how to integrate in a fundamental way, precisely because AI systems are already highly complex. Even the engineers involved in the design are often not entirely sure of how an AI system produces the results it does. That opens up a lot of questions around advanced technology systems and exactly how co-design will fit in. Nonetheless, there are certainly roles for it to play in areas like selecting training data, designing the architecture of AI systems, evaluating results and, particularly, in their governance and use. Perhaps to turn to Lida to take this conversation forward: it would be interesting to hear from you about ways in which customers and other kinds of stakeholders are able to get involved in AI design in a spirit of co-design.

LIDA GHAHREMANLOU: I like the way you are looking at it from a research perspective. That is completely the right conclusion that you have reached. Also, I just want to make a point about Snow's comment about what care means.
It is something that, when the big tech companies designed and started doing these AI products, is the question they asked themselves later on: what does care mean? There has been a transfer from the design perspective, from the technology perspective, to the care perspective. I can see the shift has been significant, and it is obvious that inclusive design is now coming to the centre of the decisions and the principles that I discussed. One of the ways the company has shifted is by providing employees with mandatory training on accessibility for AI, meaning we are supposed to take this online training so that we know how to refer to people and what terminology to use, not just the technology and not just the AI kind of way that we always speak. That has become a core part of training for us; for every employee it is essential. The other thing, through the responsibility that I mentioned, the six principles that we have got, is that we have practices across our organisation making sure that when the software has been produced and is in the hands of the end user, it follows all the essential principles that identify responsible AI, and making sure it follows those regulations.
Through that practice, they introduced tools, something like interpret; those are open-source software that they keep improving, and we are also learning how to use them to train our customers and end users, so that when they use our software, they make sure to use these tools together with it, and not just use, for example, a machine learning environment, train the model and see what the technology can do for them. We have an expression: at the end of a speech, when we talk about the technology, we say "one more thing". Have you thought about the accessibility of your products? Have we included the six principles that we have been putting in place? What are your responses to providing inclusive design in your final product? These are the questions that we have been trained to ask in the delivery services that we put forward for our customers. That is a major shift. In the past, maybe 10 years ago, this was not a concern, or maybe just a side note. At the moment, this is core for us any time we talk about the services and the product. That holds a lot of promise: that we are designing for the end user, at the core of the conversation, not just as a side point. I hope that answers the question.

SHANTHI ROBERTSON: Thank you all. These are great insights. I have one more question for the panel, but first I would like to remind our audience members.
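The transcript does not detail how tools like interpret work, but the core idea behind such "glass-box" explainers can be sketched in the simplest possible case: a model whose prediction decomposes exactly into per-feature contributions, so a user can see why the model produced a result rather than just what it produced. The model, feature names and numbers below are hypothetical illustrations, not the interpret library's API.

```python
def explain_linear(weights, bias, x):
    """Per-feature contributions of a linear model's score.
    The contributions plus the bias reconstruct the prediction exactly,
    which is what makes the model a 'glass box' rather than a black box."""
    contributions = {name: w * x[name] for name, w in weights.items()}
    score = bias + sum(contributions.values())
    return score, contributions

# Hypothetical scoring model with two features.
weights = {"hours_of_support": 0.5, "distance_km": -0.2}
bias = 1.0
score, contrib = explain_linear(weights, bias,
                                {"hours_of_support": 4, "distance_km": 10})
print(score)    # 1.0
print(contrib)  # {'hours_of_support': 2.0, 'distance_km': -2.0}
```

Reading the contributions side by side shows which feature pushed the score up and which pushed it down, the same style of explanation that interpretability toolkits surface for more complex additive models.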
If you have questions for our panellists, or even a comment to add to the discussion, please type them into the Q&A, and we will start the interactive portion, addressing some of your questions and comments, very soon. I thought we would finish off with one more question for the panel, and it is really about the range of organisations that you all come from. We have Snow from a service provider delivering services here in Sydney directly to the community. We have Lida representing a large tech company. And we have the academic research perspective as well. Each of these organisations has different aims. As we have addressed in the conversation, start-ups come from a framework of being fast and agile, attributes that can sometimes sit poorly with the slower pace required by inclusion, consultation and co-design. Other organisations have different challenges to do with scale and size. The university sector has its own challenges around disability and inclusion. With the crises we have been staring down in 2020, perhaps there is an increase in the perceived need for speed, for change to happen quickly and for technological transformation solutions to be rolled out quickly. I was wondering: does this sometimes happen at the expense of some of the methods of inclusion we have talked about today? Snow, I will throw to you first. Do you have any thoughts on that crisis context and the need for acceleration?

SNOW LI: For me, it is important to see that there are two sides.
There is capacity and empowerment on both sides. The service provider has a more in-depth understanding of how to support the person, so we do lots of hand-holding support with our clients to transition them through: to help them understand technology, to build a shared understanding using different methods of technology use, and to understand what sort of technology we are using for our clients and how they perceive that technology. On the other side, from the example of the software engineer given earlier, we need to consider the start-up players in the disability space. Are they empowered enough? Do they have the capacity to design solutions? At the end of the day, is it easy for the user to use them? They want to have more efficient and better co-design workshops around the use of the product. So you do see a need for capacity building and empowerment on both sides. How do you bridge that? What sort of concrete framework or mechanism can be used to guide both sides to better reach the middle point? As you pointed out, traditional care providers tend to have a very slow process, and on the other side, the new players, the start-ups, are very forward thinking. They have their start-up structures: lots of money for the CTOs and money spent on the technology. But when they talk about technology in the care space, it is the people that matter, and they seem to be missing that point. It is not a blanket statement.
18:42:30 I am hoping to see a middle point that brings the two together: a good mechanism, an efficient mechanism, for the project.

18:42:37 SHANTHI ROBERTSON: Thanks. Lida, what about you? Do you have any thoughts about crisis times?

18:42:43 LIDA GHAHREMANLOU: Absolutely. I totally agree with all the comments. We realise that there is a gap between the technology and the delivery services that we provide. One of the best practices we have been trying in the past few years is, through these programs, to work closely for six months with start-ups. We bring them in, we help them to build, and we are there for what they are looking for in terms of their service delivery. We learn from them how to design: what the design needs to be, what needs to be included and what excluded. We bring our expertise, with the training, together with the knowledge that the start-ups bring, and work with them. We help them to build this into a prototype kind of product. That practice, for me... I always join AI for Good programs and practices, and that practice for me is very valuable. I get in touch with the actual start-ups and companies who are doing the real design; we are just providing a platform for them. This is a great shift, let's say, for a big company that is always thinking about producing more and taking the market for itself. This is a way to make them stop, think and listen. To try to give them... the thing they think they need to design.
18:45:04 They have data from people with low vision. There is a tracker they are trying to design that gives them the longitude and latitude of that person, and they are looking to build a prediction model that can help them when, for example, the person is trying to turn and it doesn't match with the location: is it a safe place to turn to?

18:45:42 SHANTHI ROBERTSON: Liam, I thought I might throw back to you now about the idea of co-design. Do you have anything further you wanted to reflect on there?

18:45:57 LIDA GHAHREMANLOU: For me, that is a very valuable environment to work in together, because I can immediately respond with the technology and the services that we have, but I have somebody sitting there working with low-vision users who is telling me something that I do not know through the data. I want to conclude that these sorts of practices and workshops that we run through the company give us the opportunity to bring a co-design concept to the core of our technology development. I believe this is the vision for Microsoft, and we are moving towards it, and I am very happy to be able to use the background and the technology that we know for these very valuable reasons.

18:46:49 SHANTHI ROBERTSON: We are keeping such a close eye on the work you are doing, Lida, to see how this plays out. Liam, I might get your thoughts. I know that we are often grappling with the tension between capital and ethical processes, so do you have anything to add there?
18:47:23 LIAM MAGEE: I was involved in the development of agile software, if you like, and what is interesting, reflecting on modes of inclusion, co-design and participatory design, is that the agile movement, which popularised this idea of moving fast and doing things at speed, was in its detail really focused on slowing down. It brought to the fore listening, as Lida has mentioned, listening to customers or users of products; building in testing early on; and a process called pair programming that would have people sitting side by side at one keyboard, which in theory would halve the rate at which software was developed. And yet, precisely because this would lead to a more reflexive process of software production, it would be agile and more responsive. So I think sometimes we are forced, and I know the kind of questions we developed for this session put forward this dichotomy, if you like, between care and patience versus moving fast, but in certain respects, at least in technology, it is often the case that agile means precisely taking time, taking time to be inclusive, to involve people more systematically, especially in the disability space. I will conclude on that point.

18:48:59 SHANTHI ROBERTSON: Thanks. I'm not getting any questions in the Q&A from the chat. Don't be shy. We still have a few minutes left in this webinar and we have smart people on the panel. No questions are too big or small, so feel free to type something in, but I might take the liberty of keeping the conversation going.
18:49:27 I thought we could go quite macro and think about big-picture stuff. For 2020, what has been the broad role of technology in relation to the crisis we have faced this year? I might start with you this time, Liam.

18:49:51 LIAM MAGEE: To go back to a point I made earlier, one of the interesting points relates to trust, and the sense in which, in the very first instance, certainly not for people who are medical practitioners or involved in medical research, but I think the public was probably surprised there was not a cure-all, that COVID did not resolve itself much more quickly, and we were forced into these progressive lockdowns. Governments had hugely varying responses to the crisis; there was no global consensus about ways to deal with it, and even expertise seemed to be split, although clearly some kind of consensus has emerged. That, in a certain sense, is profoundly shocking: there is no standard way in which crises can be analysed, rationalised, developed into a programme of operation that is then carried through in ways that are universally recognised. Of course that has always been a myth, but the extent to which countries, and communities within countries, have been deeply divided over how best to respond to the crisis, how even to interpret it, points to a deep, fundamental questioning around how we look at this, at technology, and how we understand its role in determining what courses of action we should take. There is also perhaps some surprise that technology has not been... Although, as Lida mentioned, it has been critical in the work of companies like Microsoft, critical
in gathering data sets and assisting drug and pharmaceutical companies in looking at vaccines. Things like contact tracing have been very hit and miss, I think, in terms of their efficacy from country to country. Conversely, the important role technology has played has been around mundane functions: communication, entertainment, and facilitating remote work and sessions such as this.

18:52:14 SHANTHI ROBERTSON: Snow, how about you? Did you have any big-picture reflections on technology and the 2020 pandemic?

18:52:23 SNOW LI: How adaptable we are. We helped lots of clients transition, from "I do not know what Zoom is, I cannot set it up" to the stage where you see everyone interacting on the screen. We take people on virtual tours of things, and there were cooking lessons. The connection through technology: if I had talked about that before COVID, people would have said, "No, there is no way you can connect through a computer screen," yet how quickly you could see people doing exercise together, cooking in the kitchen, everyone sharing the same recipes: "This is something we made. It looks the same." It also challenged my belief that only close contact is important. People can quickly connect with each other and adapt to their environment, whether old or not.
18:53:29 The craving for connection can still easily be facilitated through technology, and it is so important to see them smiling in front of cameras and to see the impact straight away through the computer screen. So many assumptions were made in the sector about technology use, about how people engaged with technology, sessions and telehealth, and those barriers quickly dissolved two or three months into COVID. So again, for me, the biggest lesson is how adaptable we are and how we can better use technology for good things.

18:54:06 SHANTHI ROBERTSON: Thanks. That is a good thought. We have been pretty adaptable. Lida, what about you? You are about to embark on a second lockdown in the UK. Do you have any reflections on the broad role of technology in what we have all been through this year?

18:54:26 LIDA GHAHREMANLOU: I guess I agree with Liam's comments, and I agree with Snow that we adapted very fast. For people like me who work with this high technology, the companies immediately provided us with a lot of facilities: if you needed a second monitor, they could send it to you. There were the features they added to Zoom or to Teams, and we all got set up and all had an office at home. That is a great part, and I agree on that, but I think we missed out on some of the other aspects of working from home and dealing with crises, and you only notice this when you put it into perspective with other people.
18:55:28 Yesterday, in our tech huddle, which we used to get (inaudible), I found that screen time is so overwhelming. You have to be in front of a screen from 8am to 5 or 6pm, and often working hours continue. You don't get a proper lunch break. These are all the disadvantages of this adaptability that we have. But we had a colleague with a disability on the call, and she was talking about her own experience of working from home and technology, the fact that she has a similar environment to ours. She mentioned that because of her condition she had not been able to leave the house since March, because she needed volunteers to help her to leave the house, and because of the extended time, like three months of lockdown in early spring. It just crossed my mind how little we notice these minority groups, and how our technology, which we claim has made a lot of breakthroughs, like speech-to-text and text-to-speech and vision, all the services with vision (inaudible), the NHS tracking kind of software being broadly used in the UK to register: it is a great picture out there, but there are a lot of small holes that we do not see unless somebody puts it into perspective: "You forgot about this and that." And I guess you cannot blame the technology in a broader perspective, because every hundred years human societies face these disasters. A college in London published a report that we would deal with this crisis in 18 months, and I remember when I read that report I thought it was a joke: we are going to get through it in six months.
18:58:06 Our society would not... But we are nearly 12 months in now and we still haven't found a vaccine, and governments are not putting the right measures in place and are still dealing with it. So I guess there is a lot of learning through this crisis for individuals, for organisations and for governments as well, and the best way is just to put it into perspective. I would say that is the main thing: not only looking at our own perspective, but looking at different perspectives, and then you get the right measure of how we deal with crises from a global perspective.

18:58:59 SHANTHI ROBERTSON: A very important point to finish on. We have learned a lot, there are gaps to fill, and we have to think about it from the perspective of people living in other contexts and circumstances. We do need to wrap this up. I want to thank all of you for participating. Thank you so much to our three incredible panellists for those insights and the interesting conversation, and thank you to the Auslan interpreters for your support. Before you go, I would like to remind everyone that we have one more webinar in the series, next Tuesday, 17 November from 2pm to 3pm, on a slightly different topic: engaging our audiences online; reputation, diversity and disability in the arts and creative industries. Watch out for info about that. We hope you can join us, and thanks so much, everyone, for being here. It was a really fruitful discussion.

19:00:21 LIDA GHAHREMANLOU: Thank you.

19:00:22 LIAM MAGEE: Thank you.