AI in Workforce Readiness

 Listen to Audio

Audio file

 

Listen Anywhere

March 2026

In this episode, Susan Szathmary and Richard Jaenisch, both of Open BioPharma Research and Training Institute, join the podcast to share how to accelerate the adoption of new technologies through applied AI in pharma manufacturing and for workforce readiness.

  • Guest

    Placeholder Person Graphic
    Susan Szathmary
    Executive Director
    Open Biopharma Training Institute
    Richard Jaenisch
    Senior Director of Education, Outreach and Digital Experience
    Open Biopharma Research and Training Institute
  • Transcript

    Back to Top

    Download Transcripts

    1

    00:00:00,080 --> 00:00:10,080

    Welcome to the ISPE podcast, shaping the future of pharma, where ISPE supports you on your journey, fueling innovation, sharing insights, thought

    2

    00:00:10,080 --> 00:00:14,240

    leadership, and empowering a global community to reimagine what's possible.

    3

    00:00:15,414 --> 00:00:20,135

    Welcome to the ISPE podcast, shaping the future of pharma.

    4

    00:00:20,535 --> 00:00:22,375

    I'm Bob Chew, your host.

    5

    00:00:22,454 --> 00:00:32,609

    And today, we have another episode where we'll be sharing the latest insights and thought leadership on manufacturing, technology, supply chains and regulatory

    6

    00:00:32,609 --> 00:00:36,049

    trends impacting the pharmaceutical industry.

    7

    00:00:36,609 --> 00:00:43,754

    You will hear directly from the innovators, experts and professionals driving progress and shaping the future.

    8

    00:00:44,234 --> 00:00:45,914

    Thank you again for joining us.

    9

    00:00:46,155 --> 00:00:48,715

    And now let's dive into this episode.

    10

    00:00:49,354 --> 00:00:58,679

    Our topic today is the use of applied AI in pharma manufacturing in general and workforce readiness in particular.

    11

    00:00:59,159 --> 00:01:09,239

    To share more about this topic, I would like to welcome Susan Szathmary and Richard Jaenisch, both of Open Biopharma Research and Training Institute.

    12

    00:01:09,799 --> 00:01:13,224

    Welcome to this podcast, shaping the future of pharma.

    13

    00:01:13,784 --> 00:01:14,825

    Thank you for having us.

    14

    00:01:14,825 --> 00:01:16,105

    Thank you for having us.

    15

    00:01:17,465 --> 00:01:22,745

    First, please tell me about Open Biopharma Research and Training Institute.

    16

    00:01:22,745 --> 00:01:26,905

    How did it get started, and what is its scope of services today?

    17

    00:01:28,740 --> 00:01:32,819

    Open Biopharma started in in 2020.

    18

    00:01:33,299 --> 00:01:34,740

    It's a brand new institute.

    19

    00:01:34,740 --> 00:01:42,100

    We have about 45,000 square feet of facility with 6,000 square foot of open clean room space.

    20

    00:01:42,605 --> 00:01:52,924

    And the idea was to provide training in on project in on the job training, focusing on human error reduction

    21

    00:01:53,325 --> 00:01:54,844

    and focusing on The U.

    22

    00:01:54,844 --> 00:01:55,084

    S.

    23

    00:01:55,084 --> 00:01:55,564

    Technologies.

    24

    00:01:56,020 --> 00:02:02,819

    So students and employees looking for new job or riskier or upskill, they can find training on The U.

    25

    00:02:02,819 --> 00:02:03,140

    S.

    26

    00:02:03,140 --> 00:02:03,780

    Technologies.

    27

    00:02:03,780 --> 00:02:07,459

    So this is how we incorporated AI also into our training.

    28

    00:02:07,459 --> 00:02:10,099

    All of our faculty are all coming from the industry.

    29

    00:02:10,099 --> 00:02:12,675

    So they worked at CDMO's pharma companies.

    30

    00:02:13,155 --> 00:02:17,955

    And we also have a great mentor group who are working in pharma.

    31

    00:02:18,275 --> 00:02:23,635

    So one of our major focus is advanced technologies too and advanced therapies.

    32

    00:02:23,635 --> 00:02:28,050

    So we focus on cell therapy and cell and gene therapy also.

    33

    00:02:28,689 --> 00:02:32,689

    And that's basically the core of our things.

    34

    00:02:32,689 --> 00:02:35,810

    It's more like a residency type of training.

    35

    00:02:35,810 --> 00:02:42,370

    We're not the school, but more like the next step, like a residency to get somebody ready for critical manufacturing steps.

    36

    00:02:42,504 --> 00:02:45,465

    We like to do that bridge between, you know, academia and industries.

    37

    00:02:45,465 --> 00:02:46,985

    And there's often a gap between them.

    38

    00:02:46,985 --> 00:02:56,585

    We wanna make sure that, you know, there are folks who are coming out of the coming out of these academic institutions who are right ready to work, ready to and and not just ready to work, like, ready to

    39

    00:02:56,585 --> 00:02:57,705

    work at a at a certain level.

    40

    00:02:58,349 --> 00:03:06,110

    And so, you know, one of the goals of our of our nonprofit, you because Open Biopharma Research and Training Institute is a nonprofit training institute.

    41

    00:03:06,430 --> 00:03:10,030

    And we also have, you know, we have a we have a core lab services.

    42

    00:03:10,030 --> 00:03:17,405

    We have our, you know, our polypractical production space, and we have our our training services that we we offer, and we also host as a venue as well.

    43

    00:03:17,405 --> 00:03:26,365

    So we have often a host, leading training services and training come training from different organizations who are have industry ready training.

    44

    00:03:26,580 --> 00:03:34,740

    And though we often bring do we often work deals with them to have a a channel for all of our, apprentices and interns to be able to access that training.

    45

    00:03:34,740 --> 00:03:38,180

    So this way, then when they come out, they're that's how we say when they're industry ready.

    46

    00:03:38,180 --> 00:03:47,215

    They're they're able to do all of these other things and actually try out these different elements, work on the core lab services, and really get a real well rounded, level of experience to understand where

    47

    00:03:47,215 --> 00:03:48,735

    they best fit.

    48

    00:03:48,735 --> 00:03:52,895

    Because oftentimes, it can be difficult when you're in college trying to figure out exactly what you wanna do.

    49

    00:03:52,895 --> 00:03:56,334

    And then that never really matches with whatever industry is about to do because it always changes.

    50

    00:03:56,400 --> 00:03:59,919

    You know, two years, five years, that's difficult for these things to work.

    51

    00:03:59,919 --> 00:04:02,080

    So that's Open Biopharma in a nutshell.

    52

    00:04:02,719 --> 00:04:08,000

    And, yeah, we're we're happy to be here, and it's a great it's a great organization.

    53

    00:04:08,000 --> 00:04:08,479

    Thank you.

    54

    00:04:08,479 --> 00:04:15,974

    So I've visited your facilities, which are located, what, in Carlsbad, North Of San Diego.

    55

    00:04:16,454 --> 00:04:27,220

    Maybe tell our audience a little bit more about what you've got in the labs and in the development or production spaces?

    56

    00:04:28,740 --> 00:04:37,584

    So we have clean room space, and we're focused on having equipment like single use bioreactors, chromatography systems.

    57

    00:04:37,584 --> 00:04:47,584

    So the very basic ones, everybody can train on on the basic elements, can train on how to connect them, how to gone, how to do environment monitoring in a

    58

    00:04:48,029 --> 00:04:49,470

    in a GMP setting.

    59

    00:04:49,470 --> 00:05:00,875

    And then we also have a lot of quality control type of equipment in our facilities, so flow cytometers, the absorptors, sequencer, PCR

    60

    00:05:00,875 --> 00:05:03,754

    machines, you know, chromatography.

    61

    00:05:03,754 --> 00:05:11,754

    And then we're also working on new technologies, for example, for cell and gene therapy, how we can liquid biopsy, their culture Yeah.

    62

    00:05:12,089 --> 00:05:21,290

    During production and get more information what's going in the cells so we can characterize EVs that coming out of cells and live cells and work on those.

    63

    00:05:21,290 --> 00:05:27,685

    And the students are basically exposed to all the technologies, and we have a rotation.

    64

    00:05:27,685 --> 00:05:34,165

    Make sure they 100% pipette, 100% can work in a septic environment.

    65

    00:05:34,564 --> 00:05:38,805

    And then they familiar with upstream, downstream and the quality control technologies.

    66

    00:05:39,300 --> 00:05:39,699

    Yes.

    67

    00:05:39,699 --> 00:05:42,019

    And we have a number of different locations we do that.

    68

    00:05:42,019 --> 00:05:45,300

    We have a training we have a training lab specifically dedicated to this space.

    69

    00:05:45,379 --> 00:05:49,939

    We have our upstairs spaces, which are more, you know, your basic training rooms, more for our venue operations.

    70

    00:05:49,939 --> 00:05:57,074

    But we also have a dry lab and wet lab upstairs as well for folks who wanna be able to have kind of a mixed mode experience depending upon their training.

    71

    00:05:58,275 --> 00:05:59,475

    Well, that sounds Sorry.

    72

    00:06:00,275 --> 00:06:02,915

    That sounds very comprehensive.

    73

    00:06:04,899 --> 00:06:12,500

    But tell tell me how your use of AI, is improving how you are training your students.

    74

    00:06:13,699 --> 00:06:23,384

    So when it comes to AI, we started working with it very kind of oh, and not really early on per se because, you know, AI has been around since nineteen forties, technically speaking.

    75

    00:06:23,384 --> 00:06:23,865

    Right?

    76

    00:06:24,345 --> 00:06:30,985

    So but what we're talking about when we use AI is, generally speaking, we're using, we started using generative AI.

    77

    00:06:31,305 --> 00:06:40,580

    So what that means in our case is that in 2000 in 2023, all of our we started with our apprentices, and we went, okay.

    78

    00:06:40,580 --> 00:06:46,420

    How do we make sure to give them the readiness they need to work in the in the in the future for the future workforce.

    79

    00:06:46,420 --> 00:06:46,900

    Right?

    80

    00:06:46,900 --> 00:06:50,154

    And so what we did was we started identifying.

    81

    00:06:50,154 --> 00:06:57,995

    We saw there was a there was a paper from the federal from the Fed that explained what the strong suits and weeks and weaknesses of are these are these tools.

    82

    00:06:57,995 --> 00:06:58,634

    And we said, okay.

    83

    00:06:58,634 --> 00:07:01,675

    Well, it seems we're weak research is a bit of a a weak point.

    84

    00:07:01,675 --> 00:07:01,914

    Why?

    85

    00:07:02,310 --> 00:07:09,430

    And we knew that research is a very critical element, these students are learning, and these apprentices are learning how to really conduct appropriate research.

    86

    00:07:09,589 --> 00:07:11,430

    And so we wanted to really test that.

    87

    00:07:11,430 --> 00:07:21,245

    So we started doing what's called a parallel project where everything that they would do, they would have the they would train what, what we'll call an AI assistant to do at the same time.

    88

    00:07:21,245 --> 00:07:24,845

    And so they would iterate this forth back and forth, and they've been doing this for years at this point.

    89

    00:07:24,845 --> 00:07:27,004

    So every cohort that comes on does the same thing.

    90

    00:07:27,379 --> 00:07:32,019

    And so everything that they're doing, when they do it, they do it in parallel with the actual AI assistant.

    91

    00:07:32,099 --> 00:07:33,220

    They create the assistant.

    92

    00:07:33,220 --> 00:07:34,339

    They build the elements in there.

    93

    00:07:34,339 --> 00:07:45,264

    And so that this way, what they're doing, creates a little bit of that that tacit record kind of gets a little bit more under better understood because it also means that when, inevitably,

    94

    00:07:45,504 --> 00:07:53,985

    the apprentice or intern leaves because we are an institute that takes an apprentices and and and interns on in six month, year, two year, they differ.

    95

    00:07:54,144 --> 00:07:56,625

    And so a project, you know, might suffer from a gap.

    96

    00:07:56,625 --> 00:07:59,839

    And so this allows the next person to kind of pop up and go, oh, look.

    97

    00:07:59,839 --> 00:08:00,560

    Here's the assistant.

    98

    00:08:00,560 --> 00:08:02,240

    Let's let's help them kinda carry along.

    99

    00:08:02,240 --> 00:08:03,919

    It ends up kind of carrying these gaps forward.

    100

    00:08:03,919 --> 00:08:09,839

    And so it ended up being a really useful tool to kind of carry along this tribal knowledge that normally kind of can get a little bit lost along the way.

    101

    00:08:10,319 --> 00:08:18,295

    It also meant that in this way, they were also testing their own critical thinking skills because that was one of the elements that I found out early on is that it was bad.

    102

    00:08:18,694 --> 00:08:28,879

    Early generative AI in in in 2023, was, you know, maybe at best, had hit about a 10% ratio in terms of the success rate to actually

    103

    00:08:28,879 --> 00:08:31,759

    get the link and the name and all the rest of materials right.

    104

    00:08:32,000 --> 00:08:33,679

    Now maybe it's about 40%.

    105

    00:08:33,679 --> 00:08:34,399

    It's getting better.

    106

    00:08:34,399 --> 00:08:35,840

    And this is off the shelf tool.

    107

    00:08:35,840 --> 00:08:38,320

    So when you really hone something, and that's what they learned.

    108

    00:08:38,320 --> 00:08:40,480

    When they really honed something, they managed to make it better.

    109

    00:08:40,865 --> 00:08:42,625

    But we're not just using it there.

    110

    00:08:42,784 --> 00:08:48,625

    So that's what we started working with it there to make sure that all of our staff have a really good understanding as well as all of our apprentices.

    111

    00:08:48,625 --> 00:08:57,550

    Anyone who really walks through our doors, we wanna make sure that they're a little bit better off, you know, understanding what the AI landscape looks like and how to integrate it into their workflows.

    112

    00:08:57,710 --> 00:09:05,470

    When we think about the other enhancements that we we're solely putting into play is we have a lot of sensors and cameras, a lot of data that we've been collecting.

    113

    00:09:05,470 --> 00:09:09,309

    We're trying to find new ways of leveraging that, and we'll get into that a little bit later on.

    114

    00:09:09,309 --> 00:09:19,465

    But where it happens with these with these apprentices is that they get to see actively, you know, how this change occurs over time with their assistant

    115

    00:09:19,465 --> 00:09:22,345

    and see how their assistant is kind of, like, almost a partner to them.

    116

    00:09:24,059 --> 00:09:25,740

    Well, that sounds very interesting.

    117

    00:09:26,299 --> 00:09:33,259

    So besides working with your students, how are you using AI today for your day to day operations?

    118

    00:09:35,125 --> 00:09:41,845

    So one of the things that we've been doing is that we have been, kind of upgrading our sensors and upgrading our data acquisition.

    119

    00:09:42,085 --> 00:09:52,110

    So in order to properly install really generative AI, AI, predictive AI, whatever AI you wanna put in there, you really need to have good, not only good sorting

    120

    00:09:52,110 --> 00:10:01,070

    of data in terms of data governance, you need to have a plan as to what sensors you're plugging in, how you want them to work into your workflow, how do these things make sense, because if you don't have

    121

    00:10:01,070 --> 00:10:01,950

    that collection that's there.

    122

    00:10:01,950 --> 00:10:04,670

    So that's really where a large portion of where the steps are.

    123

    00:10:04,754 --> 00:10:14,835

    We're also while we're doing this, we are finishing up the pilot of our of our initial element here, which is a, an integration where

    124

    00:10:14,835 --> 00:10:21,669

    we have a a program that has generative AI elements into it, but it isn't fully generative AI.

    125

    00:10:21,669 --> 00:10:24,470

    It's a combination of a few different types of AI.

    126

    00:10:25,350 --> 00:10:31,029

    My my brother is the architect here for this, so I have to kinda lean on him for the specifics of the engineering behind it.

    127

    00:10:31,429 --> 00:10:33,190

    I'm just the guy who communicates the elements.

    128

    00:10:33,190 --> 00:10:42,365

    So when it comes to it, what it ends up doing is it allows us to access our inventory, access our systems much cleaner, much easier, and importantly, in a way that's accessible.

    129

    00:10:42,524 --> 00:10:51,090

    So one of the biggest changes that we have regarding our the tools that we have built for our own internal management, so inventory and other elements.

    130

    00:10:51,090 --> 00:10:53,410

    It's almost like an ERP system, but it's not quite there.

    131

    00:10:53,730 --> 00:10:57,649

    The idea here is that we we look at it and we say, how do these things connect?

    132

    00:10:57,649 --> 00:10:59,009

    How does this work together?

    133

    00:10:59,570 --> 00:11:06,324

    And how do we how do we make sure that we're best leveraging all of our materials and all of our time?

    134

    00:11:06,324 --> 00:11:11,365

    It's kind of like a little bit of a resource management, a little bit of better resource tools because, you know, we're a nonprofit.

    135

    00:11:11,365 --> 00:11:16,084

    We have to leverage with what we can and what we have, and we're we're scrappy at the end of the day.

    136

    00:11:17,320 --> 00:11:22,120

    We'll scrapping a 45,000 square foot facility, as much as that happens.

    137

    00:11:22,519 --> 00:11:32,675

    But, the the main thing about this, though, is that this allows us to because of the generative AI element integrated into it and because we've worked with so

    138

    00:11:32,675 --> 00:11:42,434

    many of these, not just apprentices, but also our our, our various, employees, myself, I am a a person with a few different disabilities if my glasses don't give me away.

    139

    00:11:43,394 --> 00:11:49,970

    And and what that means is that it can be difficult for me to read certain types of screens.

    140

    00:11:50,210 --> 00:11:53,009

    Like, for instance, I have visual dyslexia.

    141

    00:11:53,009 --> 00:11:55,730

    When I look at a screen, I I kinda mix up letters and words a little bit.

    142

    00:11:55,730 --> 00:11:57,889

    Actually, I mix up concepts, which is a little more complex.

    143

    00:11:57,889 --> 00:12:04,934

    But, anyway, point is that what ends up happening is that when I look at a screen, it becomes difficult for me to particularly read certain types of fonts.

    144

    00:12:04,934 --> 00:12:13,095

    And so what we've done is we've actually had a whole bunch of assistive, components that actually completely change the interface so that this way it adapts to the user.

    145

    00:12:13,095 --> 00:12:17,570

    And the user can change however they need to, and it doesn't matter because the whole system works together.

    146

    00:12:17,649 --> 00:12:21,809

    So if somebody has, you know, vision problems like I do, I can easily adjust it.

    147

    00:12:21,809 --> 00:12:23,490

    I can put it on a high contrast mode.

    148

    00:12:23,490 --> 00:12:26,930

    I can put it on all these different modes so that this way it doesn't matter if I'm color blind.

    149

    00:12:26,930 --> 00:12:30,370

    It doesn't matter if I have any sort of visual component that's that's challenged there.

    150

    00:12:30,585 --> 00:12:32,024

    It's also screen reader friendly.

    151

    00:12:32,024 --> 00:12:36,904

    So this way, we can have folks who maybe have even more visual problems than I do working in there.

    152

    00:12:36,904 --> 00:12:47,029

    And so the benefit of this tool is that it's really, really inclusive in terms of how it's able to personalize for all these individuals who may have different needs or different interests because

    153

    00:12:47,029 --> 00:12:49,190

    maybe you don't use the tools in the same way.

    154

    00:12:49,190 --> 00:12:51,589

    And so the system is a bit more adaptive.

    155

    00:12:51,750 --> 00:12:59,909

    It actually is able to kind of adapt to the user, which is really important because every user really has different concerns about what matters to them.

    156

    00:13:01,654 --> 00:13:08,774

    So, you mentioned that this system is pulling data from a number of sensors and other sources.

    157

    00:13:09,254 --> 00:13:19,759

    Does it do kind of old fashioned statistical process control, charts, and that sort of thing, from a purely statistical analysis

    158

    00:13:19,759 --> 00:13:20,639

    point of view?

    159

    00:13:20,720 --> 00:13:26,720

    And can the user, basically kinda create their own views of data?

    160

    00:13:27,985 --> 00:13:28,465

    Yeah.

    161

    00:13:28,465 --> 00:13:30,705

    So it on the back end, yes.

    162

    00:13:30,705 --> 00:13:34,144

    It's it's it's it's that it's big charts and tables.

    163

    00:13:34,225 --> 00:13:38,065

    But in the front end, the user can completely customize whatever that appearance says.

    164

    00:13:38,065 --> 00:13:44,809

    And that goes for all of our tools, including probably one of our most essential tools, which is DAISOPs.

    165

    00:13:44,889 --> 00:13:47,049

    It's a learning tool that happens daily.

    166

    00:13:47,289 --> 00:13:54,649

    And and with that tool, they're able to keep up to date with whatever new thing that is going on in our in our building.

    167

    00:13:55,524 --> 00:13:58,004

    We'll have our trainings, we'll have that element in there.

    168

    00:13:58,245 --> 00:14:05,205

    And there, again, that adaptive approach in terms of how the actual appearance works for that user is really front and center.

    169

    00:14:05,205 --> 00:14:08,165

    Pretty very quickly, you can click on it and adjust it as you need to.

    170

    00:14:09,820 --> 00:14:14,220

    Any other ways that, AI is being used by your clients?

    171

    00:14:16,779 --> 00:14:23,419

    So with AI being used to our clients, I can't explicitly say what some of our clients are doing behind the doors.

    172

    00:14:23,419 --> 00:14:29,554

    But but with us, what happens, we have a number of different ways we utilize AI with our clients.

    173

    00:14:29,634 --> 00:14:39,899

    One is that anyone doing a project with us, all of our since all of our, all of our operators are trained in use of AI, they can better understand

    174

    00:14:39,899 --> 00:14:43,339

    where in the workflow AI needs to be integrated.

    175

    00:14:43,339 --> 00:14:46,059

    And the thing is that's, think, one of the more difficult components.

    176

    00:14:46,059 --> 00:14:49,019

    When I have discussions with a lot of folks, they really don't know where AI fits.

    177

    00:14:49,019 --> 00:14:53,259

    Most of the time, they're like, oh, I know we have this great generative AI tool, but, like, what does it do?

    178

    00:14:53,259 --> 00:14:54,059

    Where do I put this?

    179

    00:14:54,514 --> 00:15:01,154

    And so what we've done is we've really given the operators a really good understanding as to how the tools can work.

    180

    00:15:01,154 --> 00:15:03,875

    This way, when they see a project, they go, I know where to put AI.

    181

    00:15:03,875 --> 00:15:04,835

    Let's put it here.

    182

    00:15:04,914 --> 00:15:05,634

    Let's put it here.

    183

    00:15:05,634 --> 00:15:06,195

    Let's put it here.

    184

    00:15:06,440 --> 00:15:11,959

    This will help reduce the overall burden on our workflow, help make sure that we can get it done in a lot less time.

    185

    00:15:11,959 --> 00:15:13,799

    And I think that's an important component.

    186

    00:15:13,799 --> 00:15:21,720

    It's really a combination of our critical thinking skills from the operator as well as understanding where to put the AI tools along the workflow for our clients.

    187

    00:15:21,965 --> 00:15:32,125

    And then also for any clients who come in and do a training on our site, we offer both, DISOPs, which is a competency assessment tool,

    188

    00:15:32,605 --> 00:15:42,709

    to any training that happens on our site, as well as our survey tool, Compare, which essentially is a qualitative, survey that doesn't really

    189

    00:15:42,709 --> 00:15:52,309

    concern itself so much as as the idea is, like, a normal survey where it asks similar questions, but instead actually looks to to analyze the answers because that's really what matters in a qualitative

    190

    00:15:52,309 --> 00:15:52,629

    approach.

    191

    00:15:52,944 --> 00:15:59,105

    And so those tools are ones we offer to any client who comes on-site, who wants to do a training, who wants to do a project.

    192

    00:15:59,105 --> 00:16:06,464

    Because the DISOPs also allow for something called, you know you know, something that's really familiar to a lot of us in in management, which is risk management.

    193

    00:16:06,929 --> 00:16:16,450

    Because the benefit of DISOPs is that you assess the competency of the individual as they go, you assess the competency as they learn, and you get a better understanding as to where their strengths and

    194

    00:16:16,450 --> 00:16:24,075

    weaknesses are in a way that normal, you know, read and understand SOPs just probably doesn't capture.

    195

    00:16:25,675 --> 00:16:30,315

    I'm not saying it's not the industry standard, but I'm just saying maybe it's not the best.

    196

    00:16:31,035 --> 00:16:32,554

    Maybe there are better approaches.

    197

    00:16:32,554 --> 00:16:37,195

    I mean we were ready to we learned to cheat on multiple choice tests.

    198

    00:16:37,670 --> 00:16:43,029

    So this is non cheating because the answers, there is no good answer or bad answer.

    199

    00:16:43,029 --> 00:16:47,509

    It statistically compares it to the expert answer.

    200

    00:16:47,590 --> 00:16:58,245

    You actually can assess the real skills and also see how you can position, for example, an employee in a work environment

    201

    00:16:58,564 --> 00:17:09,640

    to handle the risk and who are to be the one who are going to be able to handle the risk, who are the ones who can be promoted based on their knowledge more

    202

    00:17:09,640 --> 00:17:14,759

    into supervisory position rather than basing it on other elements.

    203

    00:17:14,759 --> 00:17:25,015

    And also, a bunch of us, our family have that we have perfect training records and yet there are human errors in a production.

    204

    00:17:25,975 --> 00:17:31,815

    We can read out the sources before actually the losses occur due to this human error.

    205

    00:17:31,815 --> 00:17:38,169

    So this is where we see a big advantage for the ISOP that it gives you a transparent tool.

    206

    00:17:38,329 --> 00:17:42,809

    It also potentially gives you a transparent tool to go for interviews.

    207

    00:17:42,809 --> 00:17:48,650

    People are switching jobs, and this could be a portable record that they can bring with themselves.

    208

    00:17:48,650 --> 00:17:58,825

    So instead of lying on the resume that I did this or I did that, and I've just seen it from far away, people can actually the new employers can actually see what they did

    209

    00:17:58,825 --> 00:18:02,184

    and how competent they are in certain processes.

    210

    00:18:02,184 --> 00:18:09,730

    So and ultimately, this reduces the human errors, which for some of the pharma companies can cost hundreds of millions a year.

    211

    00:18:09,809 --> 00:18:10,130

    Yes.

    212

    00:18:10,130 --> 00:18:20,345

    And I mean, maybe not lying, but maybe some people inflate maybe what they did or something and understanding their competency directly in a traceable, transparent record

    213

    00:18:20,345 --> 00:18:21,704

    is just a lot better.

    214

    00:18:22,105 --> 00:18:28,984

    It is a much more portable digital certificate than, you know, you're gonna find from some other digital service, you know, system.

    215

    00:18:28,984 --> 00:18:30,825

    We we wanted to make it useful to us.

    216

    00:18:31,039 --> 00:18:31,359

    Mhmm.

    217

    00:18:31,359 --> 00:18:32,960

    So we wanted to make it useful to the industry.

    218

    00:18:32,960 --> 00:18:36,960

    We wanted to to make it so that this way, you know, people would benefit from it.

    219

    00:18:36,960 --> 00:18:45,119

    Because, you know, we see the challenge with all of our with all of our trainees, with all of our with all of our new folks, you know, because we work a lot with colleges in terms of getting hired.

    220

    00:18:45,119 --> 00:18:51,325

    Because as you mentioned, it can be very difficult, you know, finding the right person for the role, figuring out what they're good at, what they're not.

    221

    00:18:51,404 --> 00:18:56,845

    These are all things that can be, you know, can be can help, this tool can help kind of delineate.

    222

    00:18:56,845 --> 00:19:01,440

    And so and, also, DISOP stands for digitally interactive standard operating procedures.

    223

    00:19:01,440 --> 00:19:03,200

    I probably should have mentioned that earlier.

    224

    00:19:03,599 --> 00:19:07,359

    But it is not actually the sense of that your SOPs are digitally interactive.

    225

    00:19:07,359 --> 00:19:11,599

    It's that this is a set of SOPs that are digitally interactive for the learning process.

    226

    00:19:11,599 --> 00:19:14,880

    They are separate and not part of the DXP process, I should say.

    227

    00:19:16,644 --> 00:19:20,404

    So to go off on a tangent for a second Sure.

    228

    00:19:20,404 --> 00:19:27,285

    Was was it University of San Diego where the undergrads were a little bit deficient in math?

    229

    00:19:28,650 --> 00:19:39,130

    And do you see this kind of an AI application, being kind of a, objective measure

    230

    00:19:39,130 --> 00:19:40,970

    of student capability?

    231

    00:19:41,764 --> 00:19:50,244

    And do you think it will get broader acceptance and recognition as kind of a certificate?

    232

    00:19:50,884 --> 00:20:01,229

    I hope so because this is a transparent record and a lot of pipeline, workforce pipeline that's coming down, they can be on the spectrum.

    233

    00:20:01,629 --> 00:20:07,949

    They don't necessarily very well, but they actually could work very well in a GMP environment.

    234

    00:20:08,269 --> 00:20:13,734

    And this could be a transparent record for them that they're not screened out in HR.

    235

    00:20:13,734 --> 00:20:18,454

    But I think it should replace eventually the multiple choice questions.

    236

    00:20:18,454 --> 00:20:26,989

    So right now, hiring was based on the reputation of the institution, but doesn't necessarily mean that everybody who graduates from there are really great.

    237

    00:20:27,390 --> 00:20:30,750

    Now there is how well you pass multiple choice questions.

    238

    00:20:30,750 --> 00:20:37,950

    And I think by the time you get to graduate level, you already know how to cheat on those or how to beat the a little bit.

    239

    00:20:38,075 --> 00:20:42,234

    Well, you understand well, I mean, under reading a test is a whole skill in itself.

    240

    00:20:42,234 --> 00:20:44,714

    And, like so I I was very I was very good at it.

    241

    00:20:44,714 --> 00:20:50,234

    And so I could I could take a test, but in a subject I've never learned and get a b on it when never having taken the class.

    242

    00:20:50,394 --> 00:20:52,234

    Just understanding the the test itself.

    243

    00:20:52,234 --> 00:20:59,160

    And that's not a good, you know, approach of learning it, and that's also a component of it too because it may it may have been SDSU.

    244

    00:20:59,160 --> 00:20:59,720

    I'm not certain.

    245

    00:20:59,720 --> 00:21:06,680

    But the the the challenge is that some subjects can be a little bit more difficult to understand that translation.

    246

    00:21:06,680 --> 00:21:07,000

    Right?

    247

    00:21:07,000 --> 00:21:10,025

    Because they so they got an a.

    248

    00:21:10,025 --> 00:21:11,144

    So they got a b.

    249

    00:21:11,144 --> 00:21:14,424

    What is an a and a b worth in this environment?

    250

    00:21:14,664 --> 00:21:24,750

    You know, what are these grades worth in this environment when you have all of these you have both grade inflation, you have all of these components have kind of entered the picture that

    251

    00:21:24,750 --> 00:21:26,990

    that make it very difficult to discern these things.

    252

    00:21:26,990 --> 00:21:28,750

    And a transcript doesn't tell you that.

    253

    00:21:28,750 --> 00:21:30,429

    A syllabus doesn't tell you that.

    254

    00:21:30,429 --> 00:21:37,069

    A syllabus tells you what they conceptually could have learned, and it and a and a transcript tells you that they got the grade they got.

    255

    00:21:37,069 --> 00:21:39,984

    It does not give you a really good transparent record.

    256

    00:21:39,984 --> 00:21:42,625

    And the goal of this document is to say, hey.

    257

    00:21:42,785 --> 00:21:47,265

    This is where this person has has strengths, weaknesses, and this is where they've shown a lot of growth.

    258

    00:21:47,345 --> 00:21:51,025

    This is where, you know, they might be this might they might have a little more room to grow here.

    259

    00:21:51,509 --> 00:21:59,750

    And the the goal of this is is to really make that, you know, portability, make it strong, and make it so that this way you can better understand your teams.

    260

    00:21:59,750 --> 00:22:01,830

    It makes everyone be able to work better together.

    261

    00:22:01,830 --> 00:22:08,454

    If we understand where someone is, it's a lot easier to communicate with them what their needs are and how they're able to work with the team.

    262

    00:22:08,454 --> 00:22:14,454

    And specifically in pharma, we have a risk based management for our whole operation.

    263

    00:22:15,015 --> 00:22:25,230

    So I think this could address the risk part you can put the right people who potentially in a more risky situation in your process versus putting

    264

    00:22:25,230 --> 00:22:33,815

    somebody who shown less familiarity with the process yet into a different position.

    265

    00:22:33,815 --> 00:22:38,855

    And I think that could actually mean millions for a pharma company, putting the right people.

    266

    00:22:38,855 --> 00:22:47,950

    So a lot of times in real operation, you find out if somebody really knows what they're supposed to know when they made a mistake and say, oops, it's expensive.

    267

    00:22:47,950 --> 00:22:58,109

    So this way, you could know ahead of the time if they're of doing it or likely to work well, and you can start to position your employees

    268

    00:22:58,109 --> 00:22:58,909

    based on risk.

    269

    00:23:00,284 --> 00:23:09,404

    And I think that's for us as a nonprofit, a big advantage to see who assess who knows what.

    270

    00:23:09,325 --> 00:23:11,565

    I think it's translate to pharma too.

    271

    00:23:11,640 --> 00:23:12,599

    We gotta be lean.

    272

    00:23:12,599 --> 00:23:13,320

    We gotta be lean.

    273

    00:23:13,320 --> 00:23:13,559

    Right?

    274

    00:23:13,559 --> 00:23:18,839

    And, you know, if if we have the lean practices, we hope for other folks to be able to learn from from our space.

    275

    00:23:18,839 --> 00:23:19,559

    That's what we do.

    276

    00:23:19,559 --> 00:23:19,799

    Right?

    277

    00:23:19,799 --> 00:23:20,839

    We we test it.

    278

    00:23:20,839 --> 00:23:22,359

    We we we're the sandbox.

    279

    00:23:22,519 --> 00:23:29,575

    We're a space for you to play around with, try out these new tactics, these new equipment, these new these new, implementations, whatever you wanna try out.

    280

    00:23:29,575 --> 00:23:32,214

    We want you to try it out here, see how it works.

    281

    00:23:32,214 --> 00:23:39,575

    That's why we try to produce this information to make it ready for folks so that folks in the industry can find a way to integrate it into their own systems and say, hey.

    282

    00:23:40,309 --> 00:23:41,990

    This is how we wanna improve.

    283

    00:23:42,069 --> 00:23:43,990

    We wanna reduce the cost of pharmaceuticals.

    284

    00:23:43,990 --> 00:23:46,869

    You know, my my bill to insurance total was $6,000,000.

    285

    00:23:46,950 --> 00:23:48,710

    You know, I'm a I'm a liver transplant patient.

    286

    00:23:48,710 --> 00:23:55,349

    I was cured of hepatitis c, you know, almost thirty years after I was, you know, because I was born with it.

    287

    00:23:56,204 --> 00:23:59,724

    And so, you know, to me, that reducing that cost is big.

    288

    00:23:59,724 --> 00:24:05,484

    And all the all of our efforts, all of our things here are really in that vein of trying to reduce that cost.

    289

    00:24:05,484 --> 00:24:08,365

    And we think industry adoption is definitely an important component.

    290

    00:24:08,365 --> 00:24:08,924

    Yes.

    291

    00:24:08,899 --> 00:24:19,299

    And so I think battery and workforce that has 100% competency on key technologies have improved on actually every step in the manufacturing

    292

    00:24:19,299 --> 00:24:20,659

    of what they need to do.

    293

    00:24:20,659 --> 00:24:28,704

    Is it like measuring liquid, measuring solid, transferring liquid, transferring solid, working in our septic environment.

    294

    00:24:28,704 --> 00:24:35,505

    So we basically broke down the basic steps, make sure they have the competency, and we assess it with the DISOP.

    295

    00:24:36,210 --> 00:24:46,450

    And I think, provide reduction of cost for pharma, having less human error, which related losses, which is obviously

    296

    00:24:46,450 --> 00:24:53,494

    built into the cost of the drug actually could be a big savings for everybody all around.

    297

    00:24:53,494 --> 00:24:57,894

    And these are expenses from these losses that shouldn't occur.

    298

    00:24:58,695 --> 00:25:01,894

    For some of the large pharma, it's hundreds of millions a year.

    299

    00:25:04,480 --> 00:25:13,039

    Does your operation create standard deviations just like pharma manufacturing?

    300

    00:25:13,599 --> 00:25:14,239

    Mhmm.

    301

    00:25:14,559 --> 00:25:15,279

    Yes.

    302

    00:25:15,599 --> 00:25:17,119

    So we work with yeah.

    303

    00:25:17,119 --> 00:25:20,984

    We basically have a GMP system.

    304

    00:25:21,065 --> 00:25:24,184

    Actually, all students build a system.

    305

    00:25:24,184 --> 00:25:34,410

    So build, you know, GMP quality system, and they operate the system so they understand that in and out of how this works, which rarely is a possibility

    306

    00:25:34,410 --> 00:25:41,850

    for that because in pharma, you get and see a fraction of the whole GMP quality system.

    307

    00:25:41,850 --> 00:25:52,224

    And with us, it's we want to see we want to make sure that our graduates actually see the whole quality system and understand how the different

    308

    00:25:52,224 --> 00:25:53,664

    wheels work together.

    309

    00:25:53,825 --> 00:25:55,825

    So yes, so that's part of it.

    310

    00:25:55,984 --> 00:26:06,059

    So where I was going with this is many times a deviation gets resolved, quote, unquote, by saying, well,

    311

    00:26:06,059 --> 00:26:11,339

    it was operator error, and we retrained the operator.

    312

    00:26:11,339 --> 00:26:11,980

    Yes.

    313

    00:26:12,299 --> 00:26:12,779

    Okay?

    314

    00:26:12,779 --> 00:26:23,055

    Now so the training was inadequate to begin with, and we're gonna retrain the operator on a training system that was inadequate

    315

    00:26:23,055 --> 00:26:23,934

    to begin with.

    316

    00:26:23,934 --> 00:26:24,734

    Exactly.

    317

    00:26:24,815 --> 00:26:27,134

    That's the logic that I see here.

    318

    00:26:27,134 --> 00:26:27,934

    But Yeah.

    319

    00:26:27,934 --> 00:26:35,569

    Would your AI be able to kind of cry foul and say, wait a minute.

    320

    00:26:35,890 --> 00:26:46,134

    The human error was either, okay, the individual really screwed up or the training was not good enough or

    321

    00:26:46,134 --> 00:26:55,015

    the SOP is confusing or there's some other root cause in the manufacturing system that needs to be addressed.

    322

    00:26:55,015 --> 00:26:55,414

    Yeah.

    323

    00:26:55,414 --> 00:26:58,934

    Do you see AI getting us to that place?

    324

    00:26:59,369 --> 00:27:00,410

    Oh, I mean, absolutely.

    325

    00:27:00,410 --> 00:27:02,169

    That's exactly before.

    326

    00:27:02,170 --> 00:27:11,690

    So so one of the things one of the first projects, you know, that because on the side, my brother and I, you know, we we we do consulting for generative AI.

    327

    00:27:12,025 --> 00:27:16,744

    And one of the things we did was one of the first projects we did was on deviations, was around deviations.

    328

    00:27:17,304 --> 00:27:20,825

    And the thing is, you know, this is something that we see commonly now.

    329

    00:27:21,065 --> 00:27:29,579

    All of the big players all have AI built into looking at their deviations because it is just it's it's a great way to summarize a lot of elements and reduce a lot of the the the hours.

    330

    00:27:29,579 --> 00:27:32,139

    It's a pretty low hanging fruit, generally.

    331

    00:27:32,299 --> 00:27:37,579

    But the thing is that part that you touched on was is not something I hear a lot about.

    332

    00:27:38,044 --> 00:27:39,085

    They always hear about, hey.

    333

    00:27:39,085 --> 00:27:41,884

    I'm moving it, but they're like, ah, but we don't wanna solve the problem, though.

    334

    00:27:41,884 --> 00:27:45,964

    That that part is a bit absent, which is unfortunate because I think it's such a missed opportunity.

    335

    00:27:45,964 --> 00:27:47,244

    And I think you're right.

    336

    00:27:47,244 --> 00:27:56,210

    I and I think this is one of the areas where, you know, we're going to see our when our deviations come out come into play, and we see that element that says, hey.

    337

    00:27:56,210 --> 00:27:57,489

    This is a human error.

    338

    00:27:57,730 --> 00:28:02,049

    We happen to have the training record that's very transparent, that's very that's very traceable test.

    339

    00:28:02,049 --> 00:28:02,529

    Oh, look.

    340

    00:28:02,529 --> 00:28:07,490

    Looks like they're we see right here that they're missing this specific pipetting skill.

    341

    00:28:07,545 --> 00:28:12,345

    What we see now why this did not work, well, because they're not doing very well in this.

    342

    00:28:12,345 --> 00:28:18,265

    Let's go back and retrain them on this specific area so we don't have to waste time with all the rest of this because this is the real weak point.

    343

    00:28:18,585 --> 00:28:21,819

    Focus on that, and then we'll probably see a lot better results.

    344

    00:28:21,819 --> 00:28:26,380

    Now I say probably because we haven't instituted that yet, but we will see that happen in the future.

    345

    00:28:26,380 --> 00:28:27,419

    I'm certain of it.

    346

    00:28:27,500 --> 00:28:28,059

    Yeah.

    347

    00:28:28,299 --> 00:28:28,619

    Yeah.

    348

    00:28:28,619 --> 00:28:35,335

    I mean, using AI to write a deviation that, you know, okay.

    349

    00:28:35,335 --> 00:28:37,974

    That's kind of an efficiency tool.

    350

    00:28:37,974 --> 00:28:38,454

    Yeah.

    351

    00:28:38,534 --> 00:28:41,094

    But AI is statistics.

    352

    00:28:41,095 --> 00:28:49,095

    And using AI to statistically say, you really think it's it's retraining is gonna work?

    353

    00:28:49,095 --> 00:28:56,169

    Well, my statistics tell me that, you have a ten percent chance of being successful with that approach.

    354

    00:28:56,490 --> 00:28:57,769

    Do you think we'll get there?

    355

    00:28:57,930 --> 00:28:58,490

    Yes.

    356

    00:28:58,490 --> 00:28:59,450

    Oh, absolutely.

    357

    00:28:59,450 --> 00:29:02,650

    That's that's kind of the goal to get there.

    358

    00:29:02,650 --> 00:29:08,265

    But the other side of it, we also train our students focusing of the language.

    359

    00:29:08,265 --> 00:29:10,585

    So they actually have to write their SOPs.

    360

    00:29:11,065 --> 00:29:18,424

    And we're looking at how things in an SOP could lead to human error and how we can eliminate that.

    361

    00:29:18,820 --> 00:29:28,980

    So we're training them also to how to write SOPs and without things like, for example, if you have a sentence that has two things in it, typically the second

    362

    00:29:28,980 --> 00:29:39,265

    part of the sentence where the error is going to happen because if you the SOP say, do this and then do that, the second part sometimes get where

    363

    00:29:39,265 --> 00:29:40,224

    the error happens.

    364

    00:29:40,224 --> 00:29:47,825

    So we basically write, train them how to write SOPs to avoid human errors and set it up properly.

    365

    00:29:48,470 --> 00:29:52,789

    And then, so we're looking at that as a potential resource as well.

    366

    00:29:52,789 --> 00:30:02,914

    And I think, yes, it's AI going to come in because AI helps us with that to sort out some of these basic elements that based on human behavior already

    367

    00:30:02,914 --> 00:30:11,394

    shown in other industries that leads to human errors, like a nuclear power or space industries.

    368

    00:30:11,394 --> 00:30:11,875

    Yes.

    369

    00:30:11,875 --> 00:30:19,519

    And at the end of the day, AI can be integrated with SOPs very SOPs are kind of the language of a lot of generative AI kind of anyway.

    370

    00:30:19,599 --> 00:30:25,279

    If you really think about the if you break down how an LM works, I mean, you're talking about the tokens, each of the words.

    371

    00:30:25,279 --> 00:30:25,519

    Right?

    372

    00:30:25,519 --> 00:30:28,720

    And words are things that modify words or what tokens are.

    373

    00:30:28,960 --> 00:30:38,875

    And and so what that means in this instance is with an SOP, it's just a series of tokens that all interrelate, and you have the right vectors that associate with them, vectors being that they tie the things

    374

    00:30:38,875 --> 00:30:41,034

    closest to one another and associated objects.

    375

    00:30:41,034 --> 00:30:46,990

    The the idea here is that the SOP is actually a really good framework generally for an LLM to understand.

    376

    00:30:47,150 --> 00:30:52,589

    An LLM being a large language model like like ChatGPT or any of those other tools, DeepSea.

    377

    00:30:52,589 --> 00:30:56,109

    Any any of the tools, even if it's off or closed or open, doesn't matter.

    378

    00:30:56,190 --> 00:31:02,234

    The idea here is that, you know, when we use a generative AI tool, which is an LLM, sorry for using all these language here.

    379

    00:31:02,234 --> 00:31:02,954

    Don't worry.

    380

    00:31:02,954 --> 00:31:07,994

    There'll be you can learn more with AI literacy stuff that we do with the AI COP.

    381

    00:31:08,474 --> 00:31:11,594

    Just a plug for the ISP, AI COP for a second.

    382

    00:31:12,234 --> 00:31:13,514

    You know, please do join us.

    383

    00:31:13,514 --> 00:31:14,794

    We are working more with AI literacy.

    384

    00:31:14,794 --> 00:31:15,194

    It's great.

    385

    00:31:16,490 --> 00:31:22,329

    But the idea here being that with SOPs, they're a prime opportunity as well.

    386

    00:31:22,730 --> 00:31:29,529

    And maybe the way they're designed, maybe whether they're structured, like you're saying that order isn't necessarily the best way, maybe there's a way to make that better.

    387

    00:31:29,845 --> 00:31:37,525

    And one of the nice things about this is not only with, with these tools can you suddenly make sweeping changes to different SOPs.

    388

    00:31:37,605 --> 00:31:40,325

    You can so, obviously, you're taking them out of your quality system.

    389

    00:31:40,325 --> 00:31:42,005

    You're not gonna do this live in a real system.

    390

    00:31:42,005 --> 00:31:52,299

    But, you know, in a in a in a in a more of a sandbox environment, you can really kind of, you know, tweak these SOPs to to protect them in a way that is you know, would take a long,

    391

    00:31:52,299 --> 00:31:54,859

    long time without generative AI.

    392

    00:31:55,340 --> 00:32:00,779

    You know, we've been experimenting with a few different ways to look at SOPs generally that are not part of that quality system.

    393

    00:32:00,974 --> 00:32:06,095

    So this way, we can see them from a learning standpoint, and we're wondering how that will eventually translate.

    394

    00:32:06,335 --> 00:32:11,214

    And, you know, we'll talk more about that probably near the end of the year once we've got a lot more data on that.

    395

    00:32:11,615 --> 00:32:14,609

    But, you know, for now, this is where we are.

    396

    00:32:14,609 --> 00:32:24,769

    Reading these issues from a pharmaceutical process and looking at the training record, looking at what could lead

    397

    00:32:24,769 --> 00:32:31,134

    to error in an SOP and potentially DCR the SOP and upgrade it to less human errors.

    398

    00:32:31,134 --> 00:32:34,014

    Think AI is a great tool for that.

    399

    00:32:34,575 --> 00:32:37,375

    So let me ask two related questions.

    400

    00:32:37,375 --> 00:32:42,255

    They both are around the subject of visual or visualization.

    401

    00:32:43,130 --> 00:32:50,809

    One, are you looking at how to integrate video into an SOP?

    402

    00:32:51,210 --> 00:32:58,025

    And two, are you using vision systems to assess operator competency?

    403

    00:32:59,865 --> 00:33:02,904

    So vision is an interesting situation.

    404

    00:33:02,904 --> 00:33:10,664

    So one thing that we've been attempting to do with sensors and camera is really understand how someone is meeting the SOP.

    405

    00:33:10,859 --> 00:33:18,859

    And that's something that we've been kind of working on for a long time, I think it's gonna take a a little while yet to really understand exactly how somebody someone is moving or is doing something and

    406

    00:33:18,859 --> 00:33:24,940

    how that matches an SOP because of the way people are physically positioned, and there are many other factors involved in that.

    407

    00:33:24,940 --> 00:33:28,565

    So that's why it's gonna take a little bit longer to better understand that element.

    408

    00:33:28,804 --> 00:33:38,325

    But there are obvious other ones, like having a more more kind of digital SOP where it is it does, like you're saying, have a have video.

    409

    00:33:38,325 --> 00:33:41,524

    Say, oh, well, this is something we don't do very often, but it's very critical.

    410

    00:33:41,640 --> 00:33:51,880

    Here's a video of an operator exactly going through with it and explaining step by step along with that the SOP already does, but reiterating it again and showing it as they are doing it right

    411

    00:33:51,880 --> 00:33:52,359

    through there.

    412

    00:33:52,359 --> 00:34:00,894

    And I think we've seen, we've worked with with groups doing, AR as well on that approach and saying how this works with AR, which is useful.

    413

    00:34:01,214 --> 00:34:08,255

    However, the the challenge I think with, you know, with with VR and with AR is, again, the certain limitations of of the tool itself.

    414

    00:34:08,255 --> 00:34:08,494

    Right?

    415

    00:34:08,494 --> 00:34:11,614

    So with with VR, it's a matter of, like, I can't use VR.

    416

    00:34:11,614 --> 00:34:18,110

    Like, I'm too my my my eyes are such that, you know, it is very difficult for me to be on VR for a time.

    417

    00:34:18,269 --> 00:34:21,150

    And I have colleagues who get vertigo very quickly on them as well.

    418

    00:34:21,150 --> 00:34:23,070

    So it it it does have limitations.

    419

    00:34:23,389 --> 00:34:28,844

    That being said, watching the video or seeing a video on an iPad while you're there in there and saying, okay.

    420

    00:34:28,844 --> 00:34:30,125

    Well, this is what I need to do.

    421

    00:34:30,125 --> 00:34:31,164

    I can see that.

    422

    00:34:31,164 --> 00:34:36,605

    Or having a tool you can kinda show up over the iPad or the augmented reality situation, that's really amazing.

    423

    00:34:36,900 --> 00:34:38,179

    Being able to kinda say, okay.

    424

    00:34:38,179 --> 00:34:42,579

    Well, here's where my next steps are following that SOP along those ways.

    425

    00:34:42,659 --> 00:34:43,619

    Phenomenal.

    426

    00:34:44,099 --> 00:34:54,235

    Something that we definitely will look at once we have, you know, more operations going in our in our actual in in our clean rooms because at the end of the day, until that happens, we're

    427

    00:34:54,235 --> 00:34:55,195

    not on that.

    428

    00:34:55,195 --> 00:34:57,275

    We we can't produce data or check on it.

    429

    00:34:57,275 --> 00:35:04,315

    But in terms of our training lab, where we are there, there, you know, it's we're mostly looking at how are their pipetting skills?

    430

    00:35:04,315 --> 00:35:05,594

    How are they working under the hood?

    431

    00:35:05,869 --> 00:35:07,630

    How are they working at the microscope?

    432

    00:35:07,630 --> 00:35:11,949

    How are they how are they, you know, working in these more more traditional laboratory settings?

    433

    00:35:11,949 --> 00:35:13,150

    So we look at it there.

    434

    00:35:13,550 --> 00:35:18,829

    We're looking at both the video that's, being taken, and there they can kind of set up a a sample video.

    435

    00:35:18,829 --> 00:35:18,989

    Yeah.

    436

    00:35:19,985 --> 00:35:20,864

    So I think Alright.

    437

    00:35:20,864 --> 00:35:30,945

    Let's question the the videos could be distracted, but I think, you know, learning in the first place and have the training and the training completed so

    438

    00:35:30,945 --> 00:35:32,625

    they're competent with the SOP.

    439

    00:35:32,625 --> 00:35:34,720

    I think it's kind of the good way to go.

    440

    00:35:34,720 --> 00:35:35,200

    Yes.

    441

    00:35:36,160 --> 00:35:44,320

    So let's shift the focus a little bit to Open BioPharma and its commercial endeavors.

    442

    00:35:44,320 --> 00:35:45,599

    How are you funded?

    443

    00:35:45,599 --> 00:35:48,079

    Is it public, private or a mix?

    444

    00:35:49,144 --> 00:35:53,385

    So it's a Open Biopharma is a 501C nonprofit.

    445

    00:35:53,385 --> 00:35:58,025

    So partially, we are funded by participating in grants.

    446

    00:35:58,025 --> 00:36:01,945

    So we have workforce development grants and other grants with partners.

    447

    00:36:01,945 --> 00:36:05,320

    We partner with community colleges, universities.

    448

    00:36:05,559 --> 00:36:14,920

    We partners also across the industry, with Texas, A and M and North Carolina, and they're applying for different workforce development grants.

    449

    00:36:14,920 --> 00:36:25,135

    So all one of our core thing to make sure we develop the future workforce, and that workforce is going to reduce these human error elements and move

    450

    00:36:25,135 --> 00:36:29,934

    some of the new technologies potentially faster into the pharma space.

    451

    00:36:29,934 --> 00:36:31,135

    So that's one of our source.

    452

    00:36:31,430 --> 00:36:35,910

    We have other grant sources too, then we work as a training facility.

    453

    00:36:35,910 --> 00:36:39,750

    So we have other companies come and train or we co train with them.

    454

    00:36:39,750 --> 00:36:42,630

    We develop our own training program.

    455

    00:36:43,190 --> 00:36:52,074

    And then we also working since we had a hands on, you know, on the job training, we bring in projects from companies.

    456

    00:36:52,074 --> 00:36:52,635

    Mhmm.

    457

    00:36:52,795 --> 00:36:53,994

    Project based learning is a big one.

    458

    00:36:53,994 --> 00:36:57,675

    Project based learning, and we bring in projects from from companies.

    459

    00:36:57,755 --> 00:36:59,755

    And they sponsor the training.

    460

    00:37:00,170 --> 00:37:07,690

    For example, we agreed that this six months, the one year project, they sponsor the training and we deliver.

    461

    00:37:07,690 --> 00:37:13,849

    And this is all for our students is they have capstone projects a lot of times.

    462

    00:37:14,675 --> 00:37:17,795

    And we accomplished actually pretty good.

    463

    00:37:17,795 --> 00:37:26,994

    So there are products that's going out of this process that students has developed and helping the cell and gene therapy industry soon.

    464

    00:37:27,179 --> 00:37:27,739

    Pretty big win.

    465

    00:37:27,739 --> 00:37:29,819

    One market and it's a pretty big win.

    466

    00:37:29,819 --> 00:37:41,105

    And for them to have also an experience that they're developing product, we're also helping small companies to translate processes or process development into manufacturing and

    467

    00:37:41,105 --> 00:37:43,264

    try to fill that gap.

    468

    00:37:43,264 --> 00:37:52,864

    And this is the gap where CDMOs are not making money, but doing it the wrong way can cause the company their life or could cause their investment.

    469

    00:37:52,864 --> 00:37:55,824

    So we're trying to help us a nonprofit in this early space.

    470

    00:37:56,380 --> 00:37:58,699

    Too, and we're bringing in several projects.

    471

    00:37:58,699 --> 00:38:00,460

    And then we have industry partners.

    472

    00:38:00,460 --> 00:38:06,139

    We're working with an application data, sales force training and other application.

    473

    00:38:06,139 --> 00:38:16,525

    And then we started also series of conferences or mini symposia around CMC.

    474

    00:38:16,525 --> 00:38:18,204

    We found a great need for that.

    475

    00:38:18,204 --> 00:38:23,484

    We had the first CMC conference in December, and we had close to 100 participants.

    476

    00:38:24,039 --> 00:38:29,880

    And so the facility was packed actually, and we continue next one in April.

    477

    00:38:29,880 --> 00:38:36,440

    So we will continue the CMC because we see the need that this is where there's a huge gap.

    478

    00:38:36,994 --> 00:38:37,554

    Yeah.

    479

    00:38:37,554 --> 00:38:39,315

    And we also host.

    480

    00:38:39,554 --> 00:38:48,594

    So we so if anyone has any training that they wanna, you know, derisk and and move off their site, you know, obviously, we have the space for it both in the training lab and then the and then the CNC and

    481

    00:38:48,594 --> 00:38:52,289

    clean room spaces as well as our training, you know, room itself.

    482

    00:38:52,289 --> 00:38:54,210

    Like I said, upstairs, dry lab, wet lab.

    483

    00:38:54,210 --> 00:38:55,890

    We have a lot of different options up there.

    484

    00:38:56,130 --> 00:38:58,210

    So we host trainings from our from organizations.

    485

    00:38:58,210 --> 00:38:59,969

    That's that's another source of revenue.

    486

    00:38:59,969 --> 00:39:01,730

    We also put on our own trainings.

    487

    00:39:01,730 --> 00:39:09,324

    We actually have a have an AI workshop series that we've launched, and we're gonna be having some different AI workshops throughout the year.

    488

    00:39:09,405 --> 00:39:11,324

    We'll be having a number of number of ones.

    489

    00:39:11,324 --> 00:39:12,925

    We have a few codeveloped ones.

    490

    00:39:12,925 --> 00:39:16,844

    We've designed them for specific agencies and then kind of expanded them.

    491

    00:39:16,844 --> 00:39:18,045

    We have one that we designed for CIRM.

    492

    00:39:18,860 --> 00:39:24,140

    That's specifically a cell workshop therapy that we give off that we that we we do every year.

    493

    00:39:24,300 --> 00:39:33,664

    And we'll be actually preparing it for industry as well because we've in conversation, it seems like a lot of folks would really like to better understand what cell and gene therapy, like, is if you're

    494

    00:39:33,664 --> 00:39:40,304

    if you're in the sales or if you're in vision development or in that space where you just kinda entered into it and you're like, well, I'm from other areas of biopharma or life sciences, but I don't know

    495

    00:39:40,304 --> 00:39:41,664

    what cell and gene therapy is.

    496

    00:39:41,664 --> 00:39:51,710

    And this way, can get a they can get an immersive two week workshop where they get to do all the fun stuff with with some different I forget what cells they work with because I I

    497

    00:39:51,710 --> 00:39:52,349

    don't work with it too many.

    498

    00:39:52,590 --> 00:39:54,349

    Work with IPSCs.

    499

    00:39:54,349 --> 00:39:56,670

    We work with Amazon cameras, stem cells.

    500

    00:39:56,670 --> 00:39:58,349

    We work with CAR T cells.

    501

    00:39:58,349 --> 00:39:59,390

    So we have T cells.

    502

    00:39:59,390 --> 00:40:00,590

    We have microphages.

    503

    00:40:00,590 --> 00:40:01,954

    We have all kinds of cells.

    504

    00:40:01,954 --> 00:40:02,114

    Yeah.

    505

    00:40:02,114 --> 00:40:04,195

    So it's a lot of fun to work around and better understand.

    506

    00:40:04,195 --> 00:40:04,515

    Yeah.

    507

    00:40:04,515 --> 00:40:07,394

    Those are less fun, I have to say, generally speaking.

    508

    00:40:07,394 --> 00:40:09,954

    More fun in a laboratory setting when it's not around you.

    509

    00:40:10,594 --> 00:40:14,594

    But, yeah, so it we have a very diverse set of different ways of funding.

    510

    00:40:14,594 --> 00:40:16,355

    We also obviously accept donations.

    511

    00:40:16,440 --> 00:40:17,000

    You know?

    512

    00:40:17,000 --> 00:40:25,880

    So if you ever see our if you ever see on our on our on our on our Zephy page where we're we have a training on there, there is a little donation button you can add at the bottom as well.

    513

    00:40:26,679 --> 00:40:32,344

    So as a nonprofit, what inspires you to do this work?

    514

    00:40:33,784 --> 00:40:36,664

    I mean, I came from medicine.

    515

    00:40:36,744 --> 00:40:40,985

    So I'm originally trained as an MD as a pediatric surgeon.

    516

    00:40:41,224 --> 00:40:44,344

    And I had kids with cancer, and my box was empty.

    517

    00:40:44,559 --> 00:40:48,719

    So I started out first on the research side, got my PhD.

    518

    00:40:48,960 --> 00:40:52,960

    And then I realized that research is a bunch of really great ideas.

    519

    00:40:52,960 --> 00:40:56,239

    But until it's not manufactured, it's nothing for a physician.

    520

    00:40:56,985 --> 00:41:07,465

    So I actually, throughout my career, I worked in different aspects of this, worked on building vaccine facility, registration of

    521

    00:41:07,704 --> 00:41:16,719

    medical device, overseeing as quality clinical studies and appreciated where the gaps are.

    522

    00:41:16,719 --> 00:41:23,280

    And as a nonprofit, we would like to be a community resource to solve some of these gaps and difficulties for the industry.

    523

    00:41:23,765 --> 00:41:26,324

    And we have the luxury of doing that.

    524

    00:41:26,324 --> 00:41:33,925

    So when companies are producing and they're under the gun to get things out, they, a lot of times, don't have the time to do this.

    525

    00:41:34,405 --> 00:41:43,829

    And we think that by providing the trained employees who already have the right thinking and the right skills, it can move into the industry.

    526

    00:41:43,829 --> 00:41:54,085

    And we feel that filling some of these gaps on the workforce side and also providing the opportunity for small companies to de risk their

    527

    00:41:54,085 --> 00:41:59,525

    CMC operation and be able to obtain financing.

    528

    00:41:59,845 --> 00:42:06,805

    These are great community resources and we're also going to move new technologies into medicine, have medicines for patients.

    529

    00:42:06,929 --> 00:42:10,210

    So these are the fun part that excites me.

    530

    00:42:10,210 --> 00:42:20,050

    And then see the young generation make a change because I think we our generation already have a routine of how we do things.

    531

    00:42:20,050 --> 00:42:24,175

    So I think to move the needle is gonna be the job for the young generation.

    532

    00:42:24,494 --> 00:42:27,215

    And we want to make sure that we train them well.

    533

    00:42:27,295 --> 00:42:27,614

    Yes.

    534

    00:42:27,614 --> 00:42:37,750

    And when when Susan came to you with this idea of what what Open Biopharma would be, you know, I at the time I had you know, ten years ago, I was very

    535

    00:42:38,710 --> 00:42:39,429

    dying.

    536

    00:42:40,070 --> 00:42:45,269

    I was an end stage liver disease at that point for almost six years, maybe seven years about that point.

    537

    00:42:45,589 --> 00:42:49,269

    And, you know, I I was dying from hepatitis c, and I contracted at birth.

    538

    00:42:49,269 --> 00:42:51,034

    And I've been on five different treatments.

    539

    00:42:51,034 --> 00:42:53,835

    And I had been that that small percentage that it didn't work for.

    540

    00:42:54,234 --> 00:42:56,714

    And, you know, my bill to insurance total was massive.

    541

    00:42:56,714 --> 00:43:03,674

    You know, by the time I got my liver transplant and was eventually cured in 2017, you know, a bill to insurance total was $6,000,000.

    542

    00:43:04,059 --> 00:43:04,859

    That's difficult.

    543

    00:43:04,859 --> 00:43:06,219

    Now my family didn't pay that.

    544

    00:43:06,219 --> 00:43:14,940

    My family paid about 350,000, maybe about $450,000 out of pocket, but that's still a hefty amount of money over a life period.

    545

    00:43:15,179 --> 00:43:17,659

    And that's really difficult, you know, to bear.

    546

    00:43:17,659 --> 00:43:24,885

    It's difficult to to see, you know, how those things even my dog agrees with that, you know, if you could hear him barking in the background right now.

    547

    00:43:24,885 --> 00:43:31,605

    So the thing is that this was this was a this was a challenge, you know, for me personally.

    548

    00:43:31,684 --> 00:43:40,239

    And, you know, the fact that my life had been saved, you know, Because when I was diagnosed at 13 in '99, there was no treatment for there was no cure for hepatitis c.

    549

    00:43:40,239 --> 00:43:41,680

    Was a treatment, but it sucked.

    550

    00:43:42,000 --> 00:43:43,119

    It was it was a year long.

    551

    00:43:43,119 --> 00:43:44,000

    It was very rough.

    552

    00:43:44,000 --> 00:43:54,224

    And, you know, to have a curative treatment happen in your lifetime, to be cured of a terminal condition in your lifetime, to be saved by so many different processes,

    553

    00:43:54,305 --> 00:43:57,425

    and then to see all of the components within that were involved.

    554

    00:43:57,425 --> 00:43:58,865

    And it was so complex.

    555

    00:43:58,945 --> 00:44:09,009

    And, really, just I'm here by by luck, by chance, by, by by will, by by by the by by such a combination of forces that I think that that shouldn't have to

    556

    00:44:09,009 --> 00:44:13,250

    happen, you know, for everyone to be saved from such a circumstance.

    557

    00:44:13,250 --> 00:44:23,835

    And so I think that one of the things that really drives us here is making sure that more people have access to life saving and life and quality of life improving medications.

    558

    00:44:24,235 --> 00:44:26,795

    Because the thing is they exist out there.

    559

    00:44:26,875 --> 00:44:30,635

    There are amazing things that are being invented and created all the time.

    560

    00:44:30,969 --> 00:44:35,610

    And our industry has shown that you can improve and save so many lives.

    561

    00:44:35,930 --> 00:44:44,969

    And I think that it is just a matter of getting the right alignment of things to make those costs and make that accessibility really happen.

    562

    00:44:45,324 --> 00:44:47,324

    And I think that's really a lot of what we try to do here.

    563

    00:44:47,324 --> 00:44:57,565

    We try to make things so this way the industry can have the opportunity to align itself better, to be a bit more smooth operating in terms of, you know, getting from, you know, getting from school to

    564

    00:44:57,565 --> 00:45:07,059

    working, having that efficiency gain, having the the the access to knowledge of production because we are pretty secretive in in the pharmaceutical industry and trying to share secrets.

    565

    00:45:07,059 --> 00:45:08,820

    We don't we try not to do that very often.

    566

    00:45:08,900 --> 00:45:10,579

    It is what we're kind of known for.

    567

    00:45:10,579 --> 00:45:11,300

    We have a process.

    568

    00:45:11,300 --> 00:45:11,780

    You're like, no.

    569

    00:45:11,780 --> 00:45:12,820

    This is my process.

    570

    00:45:12,820 --> 00:45:16,295

    And, you know, that can be difficult for advancement.

    571

    00:45:16,295 --> 00:45:26,295

    And so we wanna be a space, a place for people to be able to sandbox and try those things, to try the new projects, to be innovative in a way that maybe, you know, helps save and improve some

    572

    00:45:26,295 --> 00:45:27,094

    lives in the future.

    573

    00:45:27,349 --> 00:45:31,910

    And we just wanna be a part of this amazing industry that helps save and improve lives.

    574

    00:45:31,989 --> 00:45:34,949

    I would like to add something really quickly.

    575

    00:45:34,949 --> 00:45:38,630

    So I, you know, I got the training as a as a surgeon.

    576

    00:45:39,349 --> 00:45:49,025

    And this is an industry where a lot of time employees coming out of universities and they're trained by people who never worked in the industry.

    577

    00:45:49,344 --> 00:45:51,105

    They don't understand the problem.

    578

    00:45:51,105 --> 00:45:56,710

    So it's very hard for them to actually train on how to solve some of the problems.

    579

    00:45:57,030 --> 00:46:07,030

    So for me, the training aspects that I received during residency, that instead of trained in a classroom by people who never did surgery,

    580

    00:46:07,715 --> 00:46:17,235

    trained in a surgical suite by surgeons, why I am assisting and helping with surgery is actually, I think, a very critical thing.

    581

    00:46:17,235 --> 00:46:27,430

    And here in this industry, we're making medications and things like insulin gene therapy that goes from $05,000,000 to 5,000,000 a dose.

    582

    00:46:28,390 --> 00:46:34,710

    And yet people are trained in classroom by faculty who potentially never did this.

    583

    00:46:35,030 --> 00:46:45,134

    So I think changing the training and put them into a situation where they can learn by working with experts and all

    584

    00:46:45,134 --> 00:46:53,179

    of our faculty are coming from industry, I think, make a difference of what kind of workforce we develop for the future.

    585

    00:46:53,179 --> 00:47:03,355

    So I would like to see that students who graduate from university, even with PhDs, come and work in this kind of environment for

    586

    00:47:03,355 --> 00:47:13,675

    six months a year as part of their rotation because I think the translation and the CMC part of this new discovery is going to actually change

    587

    00:47:13,675 --> 00:47:19,730

    if we can incorporate something into as I said, even into the scientist training.

    588

    00:47:19,730 --> 00:47:24,130

    So they understand that how that idea can become a product.

    589

    00:47:24,130 --> 00:47:25,809

    And I think there's a huge gap there.

    590

    00:47:25,809 --> 00:47:37,355

    So these are the things that we would like to feel and be more like the, not the university, but the residency space for the workforce?

    591

    00:47:37,914 --> 00:47:39,914

    Well, thank you very much.

    592

    00:47:40,394 --> 00:47:50,599

    In summary, we've heard today about Open Biopharma, which focuses on operator and technician training

    593

    00:47:50,840 --> 00:47:55,559

    for pharmaceutical laboratories and manufacturing operations.

    594

    00:47:56,440 --> 00:48:02,855

    And you also do special projects, with a GMP facility.

    595

    00:48:03,175 --> 00:48:11,815

    So you're quite the resource, located North Of San Diego, but, you've you've got clients from across the country.

    596

    00:48:12,295 --> 00:48:16,680

    So I'd like to thank Susan and Richard for their time with us today.

    597

    00:48:17,239 --> 00:48:24,920

    It is great to hear about nonprofit efforts to train the pharmaceutical manufacturing and support personnel of the future.

    598

    00:48:25,594 --> 00:48:32,315

    That brings us to the end of another episode of the ISPE podcast, shaping the future of pharma.

    599

    00:48:32,635 --> 00:48:41,559

    Please be sure to subscribe so you don't miss future conversations with the innovators, experts and change makers driving our industry forward.

    600

    00:48:42,119 --> 00:48:52,420

    On behalf of all of us at ISPE, thank you for listening, and we'll see you next time as we continue to explore the ideas, trends and people shaping the

    601

    00:48:52,420 --> 00:48:53,620

    future of pharma.

Listen to Past Episodes

Audio file

In this episode, Susan Szathmary and Richard Jaenisch, both of Open BioPharma Research and Training Institute, join the podcast to share how to accelerate the adoption of new technologies through applied AI in pharma manufacturing and for workforce