Inside using labour market data to give a university an edge

In this episode of NousCast we head inside Australia’s RMIT University, which wanted to understand more about demand for skills in future so it could offer the right mix of courses today. 

Along the way we’ll hear about some of the gaps in existing data sets and how more sophisticated sources of information laid the groundwork for cutting-edge analysis and a game-changing dashboard that let decision-makers have key information at their fingertips. 

In this episode we speak to Eloise Boyd, the Director of Market Intelligence and Proposition at RMIT, and Peter Ellis, the Nous Chief Data Scientist who led the project. 

About NousCast

The NousCast podcast brings you fresh thinking on some of the biggest challenges facing organisations today. In each episode of our third series, NousCast features interviews with Nous clients and consultants about a cutting-edge project, from the challenge to the approach, the outcomes and the lessons learnt. 

“There was often quite a big disconnect [between academics and marketing]. What we were trying to do throughout this process was bring those things together in a way that all those stakeholders could utilise disparate data sources to make decisions about market demand.”

Eloise Boyd, RMIT Director of Market Intelligence and Proposition 

Ari Sharp: Hi there and welcome to NousCast, brought to you by Nous Group, an international management consultancy. I’m your host, Ari Sharp. In this series of NousCast, we’re looking at some of the projects we’ve undertaken at Nous over the past few years. You’ll get to meet the clients we’ve worked with and the Nous consultants who supported them to meet some of their biggest challenges.

Today, we are looking at RMIT University, which wanted to understand more about demand for skills in future, so it could offer the right mix of courses today. As we go, you’ll hear about some of the gaps in existing data sets and how more sophisticated sources of information laid the groundwork for cutting-edge analysis and the game-changing dashboard that let decision-makers have key information at their fingertips.

Joining me is Eloise Boyd, the Director of Market Intelligence and Proposition at RMIT, and Peter Ellis, the Nous Chief Data Scientist, who led the project. Let’s get into it. Eloise Boyd, welcome to NousCast.

Eloise Boyd: Hi Ari, nice to be here.

Ari Sharp: Peter Ellis, welcome to NousCast.

Peter Ellis: Good day. Thanks. It’s great to be here.

Ari Sharp: Eloise, if I can start with you, we know that RMIT’s a global university of technology and design, and you offer qualifications in a wide range of disciplines in Australia, as well as in Vietnam and Europe. Can you tell us a bit about how you measured market demand when developing new courses and training products? How did you anticipate the needs of future students?

Eloise Boyd: I think it’s important to remember we were working across different silos of RMIT. What was happening onshore in Australia was a little bit different to what we might have done in Vietnam or in some of our other offerings. But basically we have academics who, with their various subject matter expertise, might see a demand from a peak body and, through their connections and industry relationships, define what the needs were going to be for a certain market.

Then we had marketing assessments and market intelligence information coming through, identifying where we thought the skills gaps were and where we thought the opportunities were for different program offerings, and there was often quite a big disconnect between those two things. What we were trying to do throughout this process was bring those things together in a way that all those stakeholders could utilize disparate data sources to make decisions about market demand. We might use job ad information to determine how much demand is out there in a labor market. We might use information from the Department of Education and Skills to say, “This is the current size of that offering in that market”, and then look at a variety of other digital resources to determine whether there is or is not demand for a particular program.

Ari Sharp: Peter, if I can bring you in, can we talk a bit about the information needs in the circumstances that RMIT was facing? What are the key questions that could be resolved with data? Can you tell us what data sets you drew from and what set them apart from the public data released by the Bureau of Statistics?

Peter Ellis: When you think about any problem like this, things reduce down to a few basic questions like, “What do our potential customers want? What are our competitors doing? And what are we particularly good at?”. It’s worth saying that even though the intent of this project was to tie together a range of disparate tools, data sources and bits of analysis that RMIT were using, we weren’t trying to answer all of those questions equally. The focus was really on, “What do the potential customers of RMIT want?” and, to a certain extent, “What’s the competition out there doing?”

One way of looking at that is to look at the skills market from a traditional supply and demand perspective. We know a fair bit about the supply side of skills, because we know who is generating it. We know how many apprenticeships there are, who is going through VET training, who is going through university training and exactly what subjects they’re studying, because all of that is collected through administrative data sources, including the ones that Eloise mentioned. The Department of Education, Skills and Employment collects it from the universities, NCVER collects it from the VET training providers, and so forth.

Then there’s the number of people who are currently employed, which is almost the equilibrium bit of the market, where supply meets demand: how many people have actually got jobs. The cornerstone here is the ABS Labour Force Survey, and that’s got to be at the heart of any analysis. We use it a lot in this tool. You can combine it with the census to get really good, detailed estimates of things that a survey can’t cover, or you can combine it with macroeconomic forecasts to get future forecasts of how many jobs there are going to be. That really just tells you, “Where does supply currently meet demand?”, that equilibrium point, if you like. The other side of the equation is the demand from the employers in particular. 
If you look at what we are all trying to do in this, it’s really about trying to find out what people need to do to flourish in their careers and what work employers want their workers to be able to do, so that RMIT can target training and marketing to employers accordingly. What the employers want but isn’t currently being supplied becomes key to this.

This is where, in this project, we relied heavily on a job advert data source that’s been collected by a firm called Burning Glass Technologies. For a number of years now, they’ve been systematically collecting this data in multiple countries, and it’s really been a game changer for understanding the labor market.

They’re basically collecting the full text of tens of millions of job ads around the world, and they’re classifying them using machine learning techniques: what occupation the job is, what industry the employer is in and, really critically, detail that you’re never going to get from any other source, namely what actual skills the employer is asking for. You can have a machine read the text and say, “Oh yes, they’re asking for communication skills, or Microsoft Excel, or must be able to ride a horse”. It’s there in the full text of the job ads, and understanding that skills data gives us an amazing insight that was just never possible before we had this granular data.

Ari Sharp: Peter, you’ve got access to this rich data set. Can you tell us, what did you do next to make sense of it and present it to decision-makers?

Peter Ellis: When I personally first came into this project, there was already a vision in place, which I think had come from Eloise and some of her colleagues and some of the Nous education specialists who’d been talking to them. The key bit of the vision was a scatter plot in which the individual points were particular markets that you might target training to. They might be occupations, or they might be skills you can train people in. The idea was that we would have different metrics, which could be the different axes on this plot, and they might be something that represents how difficult it is to fill jobs in this market, or how long a job ad has to be out there, or how fast this particular market is growing.

Right from the start, this vision really dictated the idea that, “Okay, we’re going to need an organizing principle for what these markets are that we’re targeting”. We knew very quickly from discussion what people were thinking of: they were probably going to be occupations, thinking of targeting this at accountants or accountancy firms, but there might be other ways of breaking it down.

That was one unifying principle. The other was that we were clearly going to need to calculate, in a really simple way, a bunch of metrics for each one of those different markets, and from that a lot of things flowed.

Technically, it meant that the build process, when we got into it, was going to have two quite large components. First, we were going to need to pre-calculate all of these metrics for all of the different markets.

Then secondly, we were going to need a really powerful, flexible tool for putting it in front of the right decision-makers. At a more project management or people level, one of the things implied by this was that, while we could see there was a reasonably clear vision that people had sketched out on a whiteboard, “This is what we want”, we could also tell there were going to be a lot of hiccups down the road, and maybe we’ll talk about them later.

We wanted to have a process whereby we could build something quite quickly and then get a lot of feedback on a weekly sprint cycle. We were really keen on delivering working software quickly, so that we could see what it actually looked like when we translated this vision into something on a computer screen, see whether that was useful or not, and see what that meant for the rest of the project.

Eloise Boyd: Following on from that, you could see that the scope we’d identified in the initial visioning piece was quite mixed: it was a mixture of business-to-business targeting, upskilling students in what careers might be trending in the current market, and also trying to understand, refine and revise existing program curriculum and software as well.

Even from our point of view, we wanted it to be everything, a silver bullet to do all those things, and the iterative approach of working through those use cases and those business questions was really critical in refining it down to something that could actually answer the questions it was intended to, rather than getting all of that scope creep as well.

Ari Sharp: Eloise, the work that you and Nous had done [inaudible 00:10:20] came together in the form of a dashboard that you were pulling together as you went along. What was the response like when you put it up to senior management, and how is it being used now?

Eloise Boyd: I think everyone was really pleased to see it being used, because everyone’s mind is focused on making sure that we have the right program offerings in market to meet the skills changes that are happening, but also recognizing that we did need to build that capability internally. We can keep using consultants and we can keep doing those things, but this is something we can do ourselves: program health checks, and ensuring that we are meeting our obligations as a public institution to offer something where graduates are going to find employment, making sure that it’s fit for purpose for them, that it’s the right fit and context, and that it’s going to meet the needs of academics in defining and developing the curriculum that they do so well.

I think they were pleased, more than anything, to see us being able to use something that was data driven, and to see the consistency and the in-house capability building that let us scale it. I think that was what really helped get the cut-through. Being able to train other users across the university, you see it [inaudible 00:11:29] cropping up, you see the snips of it and you see the data coming from it, and you start to see this dashboard now referenced as an input in any kind of market assessment that we do. I think that’s been one of the great successes of this.

Ari Sharp: What’s been the outcome then for RMIT from this new approach to informing course development?

Eloise Boyd: I guess some of the outcomes are a more consistent approach. We’re able to utilize the data and expedite the assessments that we are doing. We’re not having to spend so much time in the gathering stage; we can do more of the analytics more quickly. It’s also building the maturity, and what we’re seeing from that is better questions coming as a result. We’re working to refine the dashboard to make it more fit for purpose, because we’ve had experience using it and we’re now able to really say, “What are we trying to get out of it?” As I mentioned, we’re also now using it as standard practice as part of our market assessments [inaudible 00:12:27] input into all the new program developments, but also the targeting that we’re looking to do for our audiences, to understand what the labor market and industries are calling for, to make sure that we are meeting the needs of employers and of graduates coming out as well.

Ari Sharp: You’ve worked with quite a few organisations to help them use data more effectively. What are the keys to success in projects like these?

Peter Ellis: I would say the first key is to have a really clear idea of the decisions that the data is going to inform. That’s something that in this case was hands down solved right from the beginning, because RMIT had a really clear idea of exactly what decisions they needed to make, and they knew that new data was available to support them. I think they had a great vision of how to do that. That’s a real success indicator, which is not always present.

The second thing I would say is being open to the possibility that the problem that really needs solving is upstream of the problem or opportunity that you’re actually seeing. It’s not uncommon that we see an organisation where there’s been a bright idea start from the top of [inaudible 00:13:46]: we want to use machine learning to improve our customer satisfaction, or something like that.

You look at it and say, “Well, actually, yeah, sure, there is potentially an opportunity there, but the machine learning is the least of your problems”. The first thing you need to do is maybe just get your data governance sorted, about who owns the data, and then get some really basic data engineering and data management sorted, and so forth. Basically, the idea is that, yes, there may be an interesting, sexy opportunity, but quite possibly there’s going to be some expensive, difficult grunt work to be done upstream first.

In the case of this project with RMIT, I think we were lucky because of the investment that had been made by Burning Glass themselves to collect this data, but also by Nous: before we came into the project, we’d invested heavily in our own data warehouse to combine the ABS data with the education data and the Burning Glass data, all in a consistent approach. That was what made it possible to do at all.

The third thing I’ll say is that when it comes to the technical implementation of analytics projects, it’s really critical that you have something [inaudible 00:14:58] to what we call the Nous Process for Data Intensive Projects. Microsoft has a thing they call the Team Data Science Process, which is very similar. It’s all just about having process [inaudible 00:15:08] your quality control, your [inaudible 00:15:10] control, making sure you’re using the right data, making sure the team’s working together efficiently, and so forth.

The final point I’d make is to have a clear understanding of how the data needs you’re looking at fit into a broader data strategy and, in particular, a change strategy for how you are going to change the way you use data, and to be aware of the trade-offs between things like quick wins versus deeper investments, or whether you need specialist capability or just need all the data available more broadly for everyone. Those are some general thoughts.

Ari Sharp: Eloise, if I can give the last word to you, RMIT’s clearly not alone in facing challenges in course development. What advice would you give to other higher education providers that are keen to use data but aren’t sure where to start?

Eloise Boyd: I think Peter’s right. You’ve got to invest in the capability. You can’t just go into things thinking that consultants are going to come and solve your problems. You’ve got to build that in-house capability to constantly refine and be the custodians of that information, because it shouldn’t be something that’s set and forget, and it shouldn’t be something that just sits on a shelf or sits in a dashboard and isn’t used. It should be owned, and it should be alive.

The other thing I’d say is develop the business questions, make sure your use cases are nice and succinct, and ask yourself the why before you ask yourself the how, rather than jumping straight to a technical or technology-based solution, for example straight to machine learning, when maybe that wasn’t what was required.

If you’re focusing on the why, on why you need this information and how you’re going to use it, then you can focus on the best way to develop that information. I would also suggest that other universities engage their communities, to showcase and signal to the wider community not just why we should use data to drive decision-making, but how we build a decision-making culture that uses this as one of the inputs.

Ari Sharp: Eloise Boyd and Peter Ellis, thanks for talking to NousCast.

Peter Ellis: Thank you.

Eloise Boyd: Thank you.

Ari Sharp: That was Eloise Boyd, the Director of Market Intelligence and Proposition at RMIT, along with Peter Ellis, the Nous Chief Data Scientist. You can connect with Peter on LinkedIn, and you can read more about an array of Nous projects on our website, www.nousgroup.com. We’ll put links in the episode notes. That’s it for this edition of NousCast; be sure to subscribe so you don’t miss an episode. We’ll catch you next time.
