Sep 3, 2024
Get ready to dive deep into the future of Vocational Rehabilitation (VR) with Dr. Joe Ashley and Dr. Bob Schmidt in our latest episode!
Joe, the dynamic Project Director of the VR-ROI initiative at George Washington University, teams up with Bob, one of the leading economists and the Project Research Coordinator, to bring you insider knowledge on revamping return on investment models for VR programs. They’re on a mission to streamline and elevate how VR agencies operate, helping them become more efficient, effective, and impactful.
Their discussion is packed with actionable insights that will empower your agency to sharpen its data collection strategies, ensuring the true value of your services shines through. Plus, learn how to better communicate the VR success story to policymakers and stakeholders!
Tune in to discover how you can maximize your VR impact with the latest advancements from the VR-ROI project. Don’t miss out!
{Music}
Joe: We're trying to make sure we have information that the director can use with policymakers, and something for clients and counselors to use to say, yes, this is the kind of services we're looking for.
Bob: The model we develop is based on readily available administrative data.
Joe: It's built on the individual customers and how well they do and what their outcomes are.
Bob: Human capital development, that's what it's all about. Some things just aren't measurable. So when you mention financial return on investment, that's what we're talking about.
Joe: If you can't capture it, you're not able to tell the story.
Carol: Yep, if it isn't documented, it didn't happen.
Bob: That's right.
Joe: Yeah.
Intro Voice: Manager Minute brought to you by the VRTAC for Quality Management, Conversations powered by VR, one manager at a time, one minute at a time. Here is your host Carol Pankow.
Carol: Well, welcome to the Manager Minute. Joining me in the studio today are Dr. Joe Ashley, the project director for the VR Return on Investment project based at the George Washington University, and Dr. Bob Schmidt, one of the five economists working on the project and the project research coordinator. So, Joe, how are things going for you today?
Joe: Today they are doing really well. Thanks for asking, Carol.
Carol: Nice to hear it, Joe. And Bob, how are you doing?
Bob: I'm doing well as well, at least, as well as Joe is doing.
Carol: That's awesome. Alright, glad to have you guys. Okay, so for our listeners, Joe is my colleague and we got him out of retirement to serve as the project director for this important initiative. And this project is funded by the National Institute on Disability, Independent Living, and Rehabilitation Research, also known as NIDILRR. Now, this is the federal government's primary disability research organization and is part of the Administration for Community Living. Now, NIDILRR's mission is to generate new knowledge and to promote its effective use to improve the abilities of individuals with disabilities to perform activities of their choice in the community, and to expand society's capacity to provide full opportunities and accommodations for its citizens with disabilities. NIDILRR achieves this mission by funding research, demonstration, training, technical assistance, and related activities to maximize the full inclusion and integration into society, employment, independent living, family support, and economic and social self-sufficiency of individuals with disabilities of all ages. They also promote the transfer, use, and adoption of rehab technology for individuals with disabilities in a timely manner, and ensure the widespread distribution, in usable formats, of practical, scientific, and technological information. And they address a wide range of disabilities and impairments across populations of all ages. Now, Joe, I know you have a little disclaimer you wanted to make.
Joe: Yeah, I just want to be sure that people understand that what Bob and I are going to talk about today is our opinion of what return on investment should be, and does not necessarily reflect what NIDILRR is looking at.
Carol: Excellent. Well thanks Joe. Let's dig in. So, Joe, why don't you kick us off and tell us a little bit about yourself and your journey in vocational rehabilitation?
Joe: Carol, I've been in rehabilitation for quite a while. I worked with the Virginia Department for Aging and Rehabilitative Services, the general agency in Virginia, for over 25 years, most of the time as an assistant commissioner in a variety of roles. I have a master's in rehabilitation counseling from the University of South Carolina. That sort of got me focused on vocational rehabilitation. And then later I had a doctorate in rehabilitation from SIU at Carbondale that took me on a path of looking at program evaluation and program development. When I got to Virginia, I was working out of the Woodrow Wilson Rehab Center, now called the Wilson Workforce and Rehabilitation Center, in a collaborative program, one of the early transition grants, in 1985, and it looked at vocational evaluation as part of a process to help kids learn what they needed to do. We were working with students from special education and vocational education in the schools, and vocational rehabilitation, and getting these systems to collaborate to help kids find out what they want to do and be successful in employment and in life. And I got to where I really enjoyed that kind of collaborative work, and I ended up as an assistant commissioner in the agency, looking at developing innovative new programs as part of my responsibilities, and looking at a lot of the ancillary support services like rehabilitation engineering and other kinds of things. Through a series of circumstances, I ended up as the director of field services for four years, where I began to get a good sense of what people with disabilities needed in terms of supports to be successful in employment and to be able to live successfully in their communities.
In addition to that, what counselors and other staff needed to be able to provide those services to them. And then I got into the job that was my favorite, which was something called grants and special programs, where I did a lot of the Social Security work: cost reimbursement, work incentives specialist advocates. We created a new system there to do fee-for-service for the work incentive services. We did a lot of work with the workforce agencies, and I did all the agreements with that. And then I got to do grants, and any of the grants that helped people with disabilities be able to live and work and thrive in their communities were things we were willing to support. I got to work with a lot of different funding systems and across a lot of different systems, you know, special ed, workforce systems, behavioral health, a lot of different groups to help people with disabilities have opportunities. So that's what I really enjoyed. And that's where I came across the late Dr. David Dean, and then Bob Schmidt as part of that package with Dean. And it was about telling the VR story. And I got real passionate about how do you tell this story in a way that is going to get people like GAO to pay attention, as well as help directors with policymakers, and help individuals and counselors make decisions about what's a good choice for them. So that's really how I got to where we are today with this new grant.
Carol: Very cool Joe. I know we all look to your program in Virginia for kind of the cutting edge stuff that was happening, because you all seem to always have just something cooking.
Joe: Yes.
Carol: It didn't matter what. And especially like the disability work incentive stuff that you were talking about and all of that. Oh gosh. I just think you've done a lot of stellar things there.
Joe: Well thank you. It was fun.
Carol: It's awesome. So, Bob, tell us a little bit about yourself.
Bob: Sure. Happy to. Joe mentioned Dr. David Dean. He was a colleague of mine in the Department of Economics at the University of Richmond. He worked on what he called the economics of disability, and he started working on that in graduate school at Rutgers with a faculty member there. So that was in the 1980s. He came to the University of Richmond, and he got me interested in it because he was an outgoing, gregarious, very bright guy and made friends easily. So he got me involved in this, probably early 1990s, and we started working with DARS on several things at the time with Joe, but also Kirsten Roe. I don't know how many people remember her, but she was instrumental in all the work we did. So this is actually our third grant with NIDILRR. The first one was a demonstration grant, so it was kind of a proof of concept. The second was implementing it. Now this one is refining it and taking it to the next step. That's what we're trying to do. So David got me excited about it. Joe keeps me excited, and he keeps me honest.
Carol: That is awesome. Well, I know just being around the director ranks for years, folks talking about return on investment, it's been, you know, a hot topic. People chat about it, but I don't know that everybody always really understands it. And I think sometimes people think maybe it's something that it isn't, and they aren't very good at explaining it, but everybody wants to do it. So you guys are going to unpack all this for us. Joe, why don't you tell us a little bit about the project and what you're trying to accomplish?
Joe: Well, with this current iteration, it's what NIDILRR calls a field-initiated project on their development side, and it's got a ridiculously long title, so I'm just going to say it is about updating and simplifying our return on investment model. That's its main purpose, and it's about helping our agencies understand what they can do to be more efficient and more effective, and take a look at the mix of services that they provide, to be sure that they are getting the most out of the resources they have to help people with disabilities improve, you know, that probability of employment and, upon employment, their earnings. And we're trying to make sure we have information that the director can use with policymakers, that agencies can use to look inside their own services and say, maybe I need more of a particular type of service because I'm getting good outcomes, or maybe I need to tweak a service because it's not getting what I want it to do, and then something for clients and counselors to use to say, yes, this is the kind of services we're looking for. We've got four goals, and the first one is really just to update the model. Our previous model was prior to WIOA implementation, so what we hope to be able to do is take a look at the data systems and the performance indicators that WIOA requires, and we can do a correlation, perhaps, with long-term employment to see how well they're correlated.
Also take a look at Covid impact. The second goal is about intensity. In our other model, you either got a service or you didn't, and if you got the service, then how did it affect employment and earnings? Well, the next logical step, according to the economists, and we have five on the project as you mentioned earlier, was: what is the intensity of the service? Does that make a difference? That intensity measure could be hours of work, it could be what it costs to do something, it could be units of service, and we're taking a look at whether that is related to the propensity for employment. The other piece that goes with that is internally provided services. In what we had before, nobody had good measures of the services their own staff provided. So we're hoping, with what we're seeing now, and we're working with the two agencies in North Carolina, and they've been extremely helpful and collaborative with us in this process, to take a look at the internally provided services and see what impact they have on the employment and earnings side of things. And then, we've been told many times, our third goal is to simplify the model. Right now it takes economists to run it. Well, that's not always a good idea for some people. So what we're trying to do is see what econometric models we could put in place to simplify this process so that it's more available to rehab agencies. But you want to make sure it's still rigorous enough to give you a reliable estimate of return on investment.
One of the things we're doing with that, as many of the folks who are listening to the podcast may be aware, is a data analysis and management capacity survey that CSAVR sent out and our advisory committee supported. With that, we got 54 agencies to provide us information on what their data capacity is, what the capacity of their staff is, and what kind of training they might be interested in. We're still looking at the data from that and we'll have some information on it later. But what we find may make a big difference in how simplified the model can be, or whether we need to take a different track to help people be able to implement a new model. And then finally, it's about knowledge translation. Part of that is getting input: we did a consumer and stakeholder forum with the North Carolina State Rehab Councils and some other stakeholders to hear what they'd like to see, what kinds of information, and whether this information would be helpful to them. And then we're going to have another consumer and stakeholder forum, probably next spring, to say, here's the model as we have it so far. Does this make sense to you, and would this be valuable to you? So those are the big overriding goals that we have for the project.
Carol: I really like that you guys are digging into the capacity that agencies have, you know, with that data analysis, because I'm just thinking, as I've been out across the country, that you've definitely got the haves and the have-nots. I mean, there for sure are folks, I think of our friends in Texas, and they have a lovely team there. They have like an amazing...
Joe: Oh yeah, they do.
Carol: ...resource team. And then you've got other folks trying to scrape together kind of a half of a position that can maybe do a little smidge of a little something around the RSA-911.
Joe: They may have a resource like a data system, but they don't have anybody who can run it, or they may have staff with the capacity to do the data system, but they don't have the system. I mean, there are a lot of different variables there.
Bob: I'd like to jump in here on just one thing, which is the simplified VR model. So the model we've developed, thank God it was by economists, is trying to address a question. The goal of the program is to get people into competitive employment, or keep them in competitive employment if they already came into the program with it, maybe build on that. Now, there are a lot of things that are correlated with how well you do in the labor market: gender, race, age, education level. All those things are correlated, right? And maybe service provision in the VR program. But we'd like to take it from "well, it's correlated, but we don't know exactly how or why" to where you can say provision of this specific type of service leads to improvement in the labor market, leads to a greater likelihood of obtaining competitive employment. Now that's a different issue. The way you normally do that, the gold standard, is a randomized clinical trial, right? Where you take people and you randomly select them and it's double-blind, so neither the researcher nor the individual involved in the experiment knows who's receiving the treatment and who isn't.
Well, that's clearly impossible in VR. First of all, it's illegal to deny service to someone who is eligible and for whom you have the money. But secondly, it's impossible. So what you have to do is impose statistical controls somehow. You have to do it through some sort of statistical model. And we've developed one which is state of the science. What state of the science inherently means is that not everybody can implement it. Even at some universities, they aren't able to implement this particular model. And so we wanted to ask the question: could we come up with a simplified version of this model, a simpler model that could be used possibly in a VR agency, or possibly at a local community college or university, something like that, and get similar results? So we wanted to see how we could do it. Is that a possible goal? What do you lose when you do it? Does it do a good enough job, or what kind of qualifiers do you have on it?
Joe: Where are the tradeoffs?
Bob: Yeah, what are the tradeoffs? That's a simpler model we're trying to do.
Carol: Should we talk about the model you developed now? Do you want to talk about it?
Bob: That'd be fine. Sure.
Carol: Let's do it.
Bob: Okay. One of the things is that the model we developed is based on readily available administrative data. What that means is you don't have to run a survey. You don't have to go out and do a very expensive sort of research project to find out what's going on. Instead, we use data from the agency's own data system, which they collect to report to the Rehabilitation Services Administration (RSA). They have really very good data; the RSA forces them to collect very good data. In fact, some of our economists' eyes just lit up when David told them the kind of data he was able to access: whoa, that's great. So there are two levels. One is you get data from the agency itself, which they provide through the quarterly RSA-911 report to the RSA, and more than that: we get much greater detail than that if we can learn how to use it. And then secondly, all the agencies have been able to give us access to unemployment insurance data, so quarterly employment and earnings data, beyond what the RSA collects at closure. Agencies are mandated to follow employment and earnings for four quarters after closure, but we don't think that's long enough, especially since WIOA, the Workforce Innovation and Opportunity Act, was passed and changed the mandate to work with transition-age students with disabilities, or providing those sorts of services.
Well, if you're going to start working with young people who are just entering the workforce, or you're providing college-level education or skills training services to any age, you can't just follow them for four quarters. I mean, if you're just entering the workforce, you're not going to enter it at the highest levels, right? So if you want to know what the real impact is, you have to follow them longer. So with the unemployment insurance agencies, we've been able to get quarterly employment and earnings data from two to three years before they even applied to the program. That's kind of a baseline. What do the services do for you? How do things change? Well, that's your baseline, three years before application. Then we try to follow them for at least five years after application. Now, the current sample starts in 2018. So the earliest applicants we have are from 2018, and then we collect all applicants between 2018 and 2021. So already it's a stretch to get five years of data. But we had to start that recently because WIOA wasn't fully implemented effectively until 2017, '18, '19. In fact, the folks in North Carolina say preferably 2019 or later. But then, you know, this thing ends in 2025, and you don't have enough data, enough tracking.
So that's the first thing: readily available administrative data, tracking earnings over as long a period of time as possible. Another thing is that generally the way these things have been looked at, you look at the VR program as a whole. You don't break it down; you look at the agency: these are people who applied for services, these are people who got to the point where they had a plan for employment services, and then how did they do? We look at it a little differently. We look at it by disability type. First of all, we look at four broad-based disability groups: folks with a cognitive impairment, which could be an intellectual disability or a learning disability; folks with a mental illness, and we also try to find out how severe that mental illness is; folks who have a physical impairment; and folks who are blind or otherwise visually impaired. We estimate those all separately, because we think services are assigned differently by disability type on average, and also the disability type affects how you will do in the marketplace. For example, what we found was that for folks with a physical impairment, unlike folks with a cognitive impairment, a cognitive impairment might be with you since birth, perhaps.
And so therefore you kind of have a steady level of earnings at a certain level. But a physical impairment often comes on very acutely, very quickly. So all of a sudden you see their pre-application earnings are pretty good, and then, boom, there's a big plummet, right? So then you have to do something different with that pre-application earnings track. So that's the second thing. The third thing is this idea of comparison: we look at the folks who had a plan and therefore received services, and we compare them to people who didn't have a plan and didn't receive services. Received services versus didn't. In economics, or the social sciences, you'd call it a treatment group and a comparison or control group. Well, we thought we could do a little better than that. We look at anywhere from seven to nine to ten to eleven different types of services, things like diagnosis, medical treatment, college education, training, all those sorts of things. We say, first of all, how is the decision made that you're going to receive this type of service? And then secondly, what impact does it have? So: what factors influence the decision to receive what type of services, and what impact does that service have in the labor market on gaining and keeping competitive employment?
So we look at that, we look at different types of service. You can see it's already a much richer type of analysis, and therefore a much more complicated type of analysis. And then the last part is that we built sort of a state-of-the-science model, and that's what makes it complicated for many people to implement. By that we mean this correlation-versus-causation issue. Instead of doing a randomized clinical trial, you have to take the data as you receive it. So you build in control by asking: how do you control for different things that might affect this that you don't observe? One of these might be motivation, right? If you have someone who's particularly highly motivated, that might lead them both to apply to a VR program, get a plan, follow through, and successfully complete the program, and might also, quite separately from whether or not they receive services, help them in the labor market, right? Because they're motivated to succeed. So how do you distinguish those things? That's tough. With a randomized clinical trial you could, because motivated and unmotivated people end up in both arms. Here you can't, so we have to impose these controls, and that gets a little complicated. So that's basically the model. Then, once you're done, you get impacts by type of service.
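[Editor's note: Bob's point about unobserved motivation, the selection problem that statistical controls are meant to address, can be illustrated with a toy simulation. Every number here is invented for illustration (the effect size, the motivation model, the earnings equation); this is not the VR-ROI project's model. It simply shows how a naive treated-versus-untreated comparison overstates a service's true impact when an unobserved trait drives both service receipt and earnings.]

```python
import random

random.seed(0)
TRUE_EFFECT = 2000.0  # assumed true quarterly earnings impact of services

people = []
for _ in range(10_000):
    motivation = random.random()          # unobserved by the analyst
    education = random.randint(10, 16)    # years of schooling (observed)
    # Motivated people are more likely to follow through and get services...
    treated = random.random() < 0.2 + 0.6 * motivation
    # ...and motivation also raises earnings on its own.
    earnings = (500.0 * education + 4000.0 * motivation
                + (TRUE_EFFECT if treated else 0.0)
                + random.gauss(0.0, 500.0))
    people.append((treated, earnings))

t_earn = [e for t, e in people if t]
u_earn = [e for t, e in people if not t]
naive_estimate = sum(t_earn) / len(t_earn) - sum(u_earn) / len(u_earn)

# The naive comparison credits part of the motivation effect to the service,
# so it comes out well above the true effect.
print(f"true effect: {TRUE_EFFECT:.0f}, naive estimate: {naive_estimate:.0f}")
```

[The gap between the two numbers is the selection bias a randomized trial would eliminate; since VR cannot randomize, the project's econometric model has to impose statistical controls instead.]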
We also collect the cost of providing those services, the cost of the program. We have those impacts, and we project them out and say: what would happen if they kept getting this benefit level for the next five, ten, fifteen years? Then you have to do what's called discounting, in the technical language of finance and economics. You do that, and then you say, okay, this is the total gain from that service, or actually from all the services combined, and this was the cost. And the difference between them is the net of costs versus benefits, right? Hopefully the benefits exceed the costs, and that's how much they've gained because of the services. That's essentially what you do. The other thing about that is we can calculate it for each individual in the sample. So we have individual-level returns on investment, individual-level benefits or effectiveness. And you can then aggregate that up and say, okay, agency-wide, this is what it looks like: the agency's return on investment for a particular disability, what the return on investment looks like for males, for females, any group you want, because we have the individual impacts. So that's the model. And we want to see whether a simplified model can get us similar information.
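[Editor's note: the last step Bob describes, projecting each individual's estimated earnings gain forward, discounting it, and netting out cost, can be sketched in a few lines. The discount rate, projection horizon, and sample figures below are invented for illustration; in the project's actual model, the per-individual gains come out of the econometric estimation.]

```python
def npv_of_gain(quarterly_gain, years, annual_rate=0.03):
    """Present value of a constant quarterly earnings gain over `years` years."""
    q_rate = (1.0 + annual_rate) ** 0.25 - 1.0   # quarterly discount rate
    return sum(quarterly_gain / (1.0 + q_rate) ** t
               for t in range(1, years * 4 + 1))

def individual_roi(quarterly_gain, service_cost, years=5):
    """Net benefit and benefit-cost ratio for one individual."""
    benefit = npv_of_gain(quarterly_gain, years)
    return benefit - service_cost, benefit / service_cost

# Hypothetical individuals: (estimated quarterly earnings gain, service cost)
sample = [(1200.0, 8000.0), (300.0, 5000.0), (2500.0, 15000.0)]
results = [individual_roi(gain, cost) for gain, cost in sample]

# Individual-level returns aggregate up to an agency-wide figure, and can be
# sliced by disability type, gender, or any other group of interest.
agency_net_benefit = sum(net for net, _ in results)
print(f"agency-wide net benefit: ${agency_net_benefit:,.0f}")
```

[Because the return is computed per individual first, any subgroup aggregate is just a different sum over the same per-person results, which is exactly the flexibility Bob describes.]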
Joe: One of the things, Carol, that I find compelling about the model in particular is something Bob just pointed out, and that is it's built on the individual customers, how well they do in this process and what their outcomes are, and it builds up. So it starts at that individual client level. The other thing: when the economists were developing the model and looking at the data of people who went through the system, they observed that there's a lot of variability in the types of services that are provided. So they built the model around that variability of services. That individual service model, that is VR, is what makes the variability work for this model. So it's very much tied to the core tenets of the VR program, that individual services model. That's where the variability comes from, and that's why it can give us some causation. So I think it's really important to note that it is consistent with how we do services and how we provide what we do. The other thing I will say about the economists is they have been dedicated to understanding how VR works. In the early days when we were going out, they would often sit down with the agencies and say, does this make sense to you? And then they would look at the model to see what would make it better at telling how VR works, or the outcomes of VR. So they've spent a lot of time trying to understand the system and get knowledgeable about how VR works, what the opportunities are, what the process is, so that what they're modeling is consistent with how we do business. So I think that's a key component.
Carol: I think that's really cool that you said that, Joe, about taking it back to the individualized nature of the program because VR, you know, you think about it in an aggregate, we get this big $4 billion in a lump. And, boy, each person's experience within that is so individualized. It is, you know, whether you're getting this or that, you know, are you getting educational sorts of services and access to training and post-secondary and all kinds of different things? Or are you a person on a different trajectory, and maybe you needed some medical rehabilitation type of stuff going on? You needed something completely different. Like, people have so many ways to mix and match and use the things they specifically need to get where they need to go. You probably can't do it unless you get down to that level. So that is very interesting. Now, Joe, I know we've talked about this in our team a little bit even. And I know you said you wrestled with your group, but this whole notion of return on investment or taxpayer return on investment has been a really interesting topic and is fraught with some issues itself. And I remember coming into Minnesota and the general agency director like taxpayer return on investment, and I was brand new in the program. I'm like, I don't even know what you're talking about right now, but a lot of times you tend to hear it discussed that way. But I know, Joe, you've said there's a lot of issues around this. So what are some of those issues?
Joe: It's an interesting little issue. The very first meeting we had, it was at Carver, and we had a number of people from different agencies and state rehab councils come in, and we were laying out the first model. One of the directors at that point said, well, are you doing a taxpayer return on investment? And by that he meant returning taxes: increases in tax receipts going back to the Treasury. That was his definition of it. That was the first time. And then when we were in North Carolina at the stakeholder and consumer forum we did, we got the question from some advocates. It doesn't seem to go away; we always get that question. But the issue is, what is the appropriate way to determine the return on investment for a particular type of program? It was interesting. We got this question so often, even from some of our workforce friends, that the economists set about writing a paper to describe why taxpayer return on investment is not appropriate for a VR type of program. And they submitted it to, I think it was three, maybe four different econ journals, and some of them didn't even send it out for review. They said, this is already settled. It's not appropriate for this kind of program.
So the issue is that VR, like other workforce programs, is human capital development. And the purpose of a human capital development type of program is, in our case, to find people employment, to look at that probability of employment and then, conditional on that, their earnings. If you've got people in your system and they're entry-level, a lot of them are not going to be at the level where they pay any kind of taxes at all for several years. So you really don't have a lot to show when you do a taxpayer return on investment. Also, one of the things we noticed in one of the studies that was done, and this was with one particular disability group, the only one they looked at this with, when we had some Social Security earnings data available to us for a short while: not only do we get people off of Social Security benefits, but we also find people who go onto Social Security benefits from being involved with VR, and that often makes them more stable. Then they can participate in a VR type of program and be successful. But it's a long, long-term process. So in the short term, about as many come on as go off, and you're really not showing anything there. But if you're doing what the authorizing legislation says you're supposed to do, which is get people employed, let's just take it down to a simple level, then the question becomes: are you efficient and effective in that process? And that's what this particular return on investment model is about. And that is what the economists would say is the appropriate way to look at this. Now, they would put it in the category of a social welfare type of program, and then human capital development. But there are other kinds of benefits that accrue to the individual.
Because this model, this type of approach, looks at benefits to the individual and to society in general, which is the individual being employed. And in this case, there are other benefits that we can't observe. Self-confidence would be a good example. Quality of life would be a good example. So in our case, what we're able to observe is how they're interacting in the workplace, and that's really the piece that we can measure. That's where we're going with this. The others might be important, but very few places have really figured out how to measure them.
Carol: Well, Joe, actually, I was telling Bob before we hopped on, I said, you know, I threw something into ChatGPT, because I was like, all right, VR return on investment, explain it to me. And ChatGPT spit it out. It talked about financial return on investment, you know, with employment earnings, cost savings. But it also talked about social return on investment: improved quality of life, community contributions, you know, people experiencing that enhanced self-esteem, independence, all those things. And then personal return on investment, with skill development, career advancement, those kinds of things. It was just kind of fun to run it through and go, hey, yeah, because I know you guys have wrestled with, like, what are you going to call the thing? Did you come up with the name, the thing?
Joe: Yes, it's interesting. I think what we came down to is that vocational rehabilitation return on investment is the name we're going to stick with, and then say, what we have is a human capital development project, and that's how we're measuring return on investment. But here's what we're up against: it is so ingrained in the culture of VR that you've got to return taxpayer dollars. Well, that's really not what VR says it's supposed to do. So how do you get people to understand that that's not the appropriate way to look at the VR program? We're going to have to do some education, I think, about what return on investment is. And I may use your ChatGPT story...
Carol: Yeah.
Joe: To ...tell it.
Carol: Bob, I see you have something you want to jump in with.
Bob: Yes, I have several things. One is, I think the reason it's so ingrained, and I might be wrong, Joe can correct me, is because agency directors have to testify before the state legislature to get the money they want, right? And the legislatures, at least for a while, I don't know if they're still doing it, were saying, yeah, but what's the return to the taxpayer on this? Why are we funding this if it's a money-losing proposition? That's the thought process. But the problem is that the state legislatures are kind of going against the federal authorizing legislation. VR dates back to, again, Joe can correct me, after World War One, when veterans came back from the war with severe physical injuries, and the federal government said, let's try to get them services to help them vocationally, help them get back to work, get a job, and keep it so that they're effective in the workplace. Well, that was incredibly successful. So over time they said, this works so well, can we expand it to other disabilities? Maybe states want to get involved as well. So what's happened over time is every one of the 50 states has this kind of co-funded arrangement with the federal government, overseen by the Rehabilitation Services Administration, and it now covers many disabilities. Some states have more than one agency: one for the blind and visually impaired, and one for the general, other disabilities. So it goes back that far. And the authorizing legislation specifically says to provide services to help the individual gain and maintain competitive employment. We're back down to the individual with that.
It doesn't say to pay for itself, to repay the state or federal government for those services. So that's one thing: that's not the metric to judge it by. A second thing is, I never did like the social welfare label. I'm an economist who would never call this a social welfare program. First of all, welfare has a negative connotation, even if its denotation is not negative. Really, it's less social welfare and more, as I said, human capital development; that's what it's all about. Joe also mentioned that some things just aren't measurable. So when you mention financial return on investment, what we're talking about is: is the agency doing its job of getting people back to competitive employment and leading a better life, and maybe freeing up some of their family to do other things? There might also be a multiplier effect, in the sense that they earn more money, they spend the money, and other people, as a result, earn more money. Economists call that a multiplier effect. So that dollar has more impact, but it wouldn't get measured in a taxpayer return on investment at all.
Carol: Okay, cool. So I know you guys have made some interesting observations in reviewing the data and looking at some of the longitudinal data. What kind of things are you guys seeing?
Joe: My observation, and it concerns me, is that we've learned recently that some states aren't capturing UI earnings data beyond the fourth quarter after exit. I know one state that is going beyond the fourth quarter for their Social Security cases, because it helps them obtain more resources through cost reimbursement. But I think we're underselling the value of VR when you only go up to four quarters after exit. I realize that's a lot more than we used to do, but it's probably not the best way to tell the VR story, because you just don't capture everything, and a younger population exacerbates this. You just don't capture all the impact VR can have for an individual over time. We had a study from a long time ago, from the first grant I did with David Dean and Bob, where we had a transition program whose students were focused on post-secondary opportunities, and they were measured against a counterpart group of youth who went through the regular VR system. The other kids typically went to work faster than the participants in this program. But at year six after application, the PERT students took off in terms of their employment. The other kids were still employed and doing well, but the PERT kids took off with this post-secondary approach, which is what we're being asked to do now. You really wouldn't have told that story if you only went five years after application. So those are the kinds of things I'm concerned about with the longitudinal data.
Carol: Joe, what about this, too: blind agencies especially tend to provide a lot of the services themselves. What kind of problems come with that, in terms of not capturing the data?
Joe: We saw that as an issue with the 2007 data set, and again in the 2012 data set. Our colleagues in the blind agencies were very clear that there were services they were providing that were critical to successful employment and adjustment, but we didn't have any way to capture them. So again, you're undervaluing the impact of those agency-provided services by not capturing them, and I think that's going to be critical. There are some requirements now that they report some of this information, but it's a question of whether it's getting into the case management system and becoming readily available administrative data that can be used to tell the story of the impact of the great work these counselors and other specialists are doing to help people become employed and adjust in their settings. Bob, do you want to talk a little bit about what you're seeing in the data?
Bob: Well, yes. With the new RSA-911 data set, the quarterly report that all agencies have to provide, again for four quarters after closure, they've made some changes. Agencies are now required to report by 32 different service types: did you provide purchased services during the quarter, and if so, how much? Was the service provided in-house, or through a comparable benefit from some other external agency? And that might have a dollar value attached to it. So we're going to use those data and see what we have. Now, of course, as with any data set, there are caveats. Purchased service data are pretty reliable, because agencies need to get reimbursed and pay the bills, so they track that through their accounting systems very well. But the other fields are often entered by counselors who are harried and busy and have a lot of other things to do besides this bureaucratic form-filling, so the data are only as good as what's put into them, and we won't know how good that is until we look. Hopefully we'll learn some things we didn't know.
Joe: What we have been told is that the data is not there for us to capture, and that it undervalues the kind of work that's being done. So we're hoping we can find a way to tell that story, because it sounds pretty important. And then from my personal experience in managing some of these services, I know how hard these folks work and how valuable these services are. But if you can't capture it, you're not able to tell the story.
Carol: Yep. If it isn't documented, it didn't happen.
Joe: Yeah.
Bob: That's right.
Carol: So what are the next steps on the grant, and how can we get folks involved? Are you needing people to help with anything, any states or anything?
Joe: We're working very closely with North Carolina, and they've been really good to work with. Once we get the prototype, I don't know what else to call it, the economists are putting together the data system information so that they can begin to apply the new model, and that'll be happening hopefully within a couple of months. Then, once we've run the model a couple of times, we'll be asking some other people, sort of a national audience, to come take a look, hear what the model is and what it offers, and give us their feedback: yes, that would be useful, or, that doesn't seem to work for me, could you do this other thing? We'll also be showing them what we've come up with for the simplified model, to see if that version is going to work, or if we need to develop maybe a template RFP they could use with a local institution they work with, so they would be able to get the data set. We may also be asking folks to work with us a little on the capacity survey, which asks about the training states might want: who can provide this kind of service, and would this be valuable to do to increase people's capacity? Because there are a lot of data needs out there, and I think if it would help our project, it would probably help a lot of other projects as well.
Carol: So, Joe, are you thinking about that for fall, possibly at CSAVR or something?
Joe: That's in November, so that should be a time when we would have an opportunity to gather some information. Yeah, we might be ready for it by then. Of course, that might put a little pressure on the economists, but I don't mind doing that.
Carol: Yeah. Bob's looking like, oh well okay.
Bob: You love doing that, Joe. One of the things my major professor in graduate school always said is, I love working on a research project where I learn something, and what Joe said is exactly right. We would take our results and vet them with various agencies. Before Covid, we would make a trip to the agency, sit down, go through everything, and explain what we're trying to do and what we've found. And then they would say, that looks a little wonky, or, did you do this? And we'd say, no, we didn't do that, but yeah, we could, let's do it, and then we would revise the model. Or, no, unfortunately we don't have enough information to do that; could you collect it? That kind of thing. So yeah, we keep learning things, and that's what these groups are intended for. For our selfish purposes, that's what we like about them.
Carol: That's excellent, you guys.
Joe: So November would be good, Bob.
Bob: So you say.
Carol: Well, I'm definitely looking forward to seeing what comes out of all of this. And you were saying that the end of the grant then is in 2025.
Joe: August 31st of 25.
Bob: Right.
Carol: All right. That's coming up quick you guys, really quick.
Joe: Oh it is.
Carol: Well, awesome I appreciate you both being on today. I cannot wait to hear more as this unfolds. So thanks for joining me.
Joe: We really appreciate the opportunity.
Bob: Yes we do.
{Music}
Outro Voice: Conversations powered by VR, one manager at a time, one minute at a time, brought to you by the VRTAC for Quality Management. Catch all of our podcast episodes by subscribing on Apple Podcasts, Google Podcasts or wherever you listen to podcasts. Thanks for listening!