24 Feb Unlocking the Potential of Government Records Data
Data is what you need for analytics, but transforming data into information is what you need for business. Analyzing records management data and transforming it into information not only helps with process management, it also helps identify areas for process improvement. Efficient processes help you get the most value.
Key Takeaways include:
- The difference between data and information
- The importance of measurable data
- The steps for process improvement
- How to transform electronic data into information for decision making
Unlocking the Potential of Government Records Data
Presented By Keli Pisciotta (Access Sciences) and Brad Doucet (Louisiana Department of Transportation)
So let’s dive right in. We’ll start with our session, Unlocking the Potential of Government Records Data. And I’d like to briefly introduce our speakers for today. We have Keli Pisciotta, who is a Project Manager at Access Sciences, and Brad Doucet, who is the Director of Enterprise Support Services at the Louisiana Department of Transportation. Keli and Brad, if you’d like to take it away, the floor is yours.
Thank you. Good morning, everybody. Thank you for joining today. Today, we’re going to go over a few takeaways for this session. They are data versus information, why measure, change management, and process improvement. And then when we get finished with that, we’re going to show you two live examples from Louisiana state agencies of how we’ve turned data into information for those agencies.
Keli, I hate to interrupt, but it looks like you want to try sharing your screen again, we’re not seeing any slides.
Can you see it now?
We see your PowerPoint app and now it’s in presentation mode that looks great. Thank you.
Sorry about that. Okay, so again, we’re going to cover data versus information, why measure, change management, and process improvement. These are four methods that you can use to turn your data into information. And then at the end, we’ll show you some live examples. The first thing I want to go over is the difference between data and information. Basically, data is what you need to do analysis, and information is what you need in the business to make decisions. So, think of data as a bunch of multicolored, unorganized blocks, just sitting there by themselves.
They really don’t lend themselves to much decision making. It’s not until they are organized, sorted, arranged, and presented in an orderly fashion that it becomes information, which provides context for your data so you can make decisions. Data needs to be accurate, it needs to be consistent, and it needs to be timely. In today’s world, there is no shortage of data anywhere. In any company you go to, I can guarantee you’re going to find multiple databases with different systems. Nothing is in one single system. You’re going to have Oracle databases, SQL databases, Access, and maybe DB2 databases.
So, all your information is scattered throughout different systems. Add on top of that social media: nowadays we monitor Twitter, Instagram, and Facebook, all of these different social media accounts, where data is constantly coming in at a very rapid pace. And on top of that, you have email to go through that has information as well. When the amount of data exceeds your capacity to analyze it, process it, and turn it into information, you get information chaos. When that happens, your analysts will get burned out trying to figure out what data to use, and the person needing to make the decisions is not going to know which information to pull.
There are multiple graphs, and none of them lend themselves to the decision that needs to be made, so it’s going to waste a lot of time. What you need to learn to do is filter out the data that you do not need. And this is something that you do in everyday life. For instance, when you get in your car, get on the interstate, and drive home, there are millions of advertising signs, speed limit signs, and exit signs, but you know the way home, so you don’t pay attention to any of those. You get in your car, you’re kind of in auto mode, and you get home every day.
But if you go on vacation and you’re taking a long drive in a place you’re not familiar with, I guarantee someone in the car is going to have to go to the bathroom. At that point, you’re going to start looking at the exit signs to see which one has a restroom you can use; you filter that data in and start using it. So it’s something that you do in everyday life. There’s a term for this: DRIP, data rich and information poor. So I’m going to go over a few pointers to help you avoid this. The first one is to provide timely information.
For instance, at DOTD, if you’re doing an application to show road closures, you want that data to be refreshed as often as you can; make it as close to real time as you can. But if you’re reporting month-end data, you don’t need to refresh it until the numbers come in at month end. You also need to provide balanced information. A lot of times you’ll be analyzing the information and you don’t get exactly what you think management wants to hear, but you need to present it anyway, because it’s outliers like that in the information that help you make process improvements and improve your business.
You need to keep it clear. I had a supervisor one time who told me the higher up you go to present information, the less uninterrupted time you’re going to get. So pretend you’re in an elevator with that person, and you only have from the time the elevator leaves the first floor to the time they get off at their floor. You need to keep it clear enough that they can get the message without you having to present a long story with your information. And keep it relevant: if you know what the decision is, don’t add anything that’s not relevant to that decision-making process.
And lastly, keep it simple. If it only takes up half a page, don’t try to fill up the entire page just to make it look pretty. Keep it simple so that the decision can be easily made. The next thing I want to go over is the importance of measurable data, or baseline data. Basically, what gets measured gets managed. When you’re doing a project, if you don’t measure anything, you can’t judge your progress on that project. So it’s very important to take measurements before you start the project so that you can measure again at the end of the project.
Also, you need to set realistic goals for yourself so that you can measure against them, because if your goals are not realistic and you’re not achieving them along the way, you’re going to lose the respect and the trust of your client. It also helps maintain accountability, making sure the person implementing the processes or the project is accomplishing what they’re supposed to throughout the project. It also informs and motivates stakeholders. A lot of times the stakeholders aren’t down in the nitty-gritty of a program or an application where the data is, but if you point out a measurable piece of data, or baseline data, they’re going to start paying attention to it as the project gets rolled out, and they’ll make sure it gets resolved in the project.
It also provides justification for policies, and it shapes your expectations and communication strategies with the users. When presenting metrics or information, visualization is very important. What you need to do is think about what it is you’re trying to accomplish. This first example is data only. That might be fine if you’re trying to find out who sold the most products and there are only a few pieces of information; this one has only four and it’s sorted in order, so that would be okay. But if you had a lot of data and it wasn’t sorted, it would be better to use a bar chart so that you can visually see which one has the most.
But again, this is only a snapshot in time, and you don’t know how it got to the 15 or the 12, the eight or the seven. You just know that at that particular time, that’s what it was. If you wanted to know more about how it got to that point, you would want to present it with trend data. That way you can see all the ups and downs. If you have stock and you’re looking to retire, you’re going to want to see which ones have the ups and downs, because if you’re close to retiring, you don’t want one that has all these spikes; you want something that stays pretty close to a straight line to make sure you don’t lose your money.
Another thing to note is that the first trend chart doesn’t show you where you are relative to your goal. If you put a goal line on there, you can see which ones are performing above the goal and which ones are not. So it just depends on what information you’re trying to look at and what decision you’re trying to make. There’s another project I did using a methodology called Agile, where you divide your work into two-week sprints. In each two weeks, you have a list of backlog items that you’re going to accomplish.
Under each one of those backlog items, you have a test case, so that when you’re finished with that item, you can test it to make sure it’s working. And you also log bugs, if there are bugs. Every day during those two weeks, you have a 15-minute stand-up meeting; you’re supposed to keep it to 15 minutes to see where your progress is. But if you have a lot of users, developers, and testers in it, getting through the meeting in 15 minutes is pretty difficult, and trying to drill down into each one of these items to take this data and turn it into information was difficult.
So, for my visualization, I created this dashboard. At the top, the first item on the left was the total number of work items I had. The second one: if I created a backlog item that didn’t have the test case finished and wasn’t approved to be worked on yet, it would be in backlog. Once it had the test case and was approved for work, it would be ready for development, and when a developer picked it up, they would change it to committed, which would move it to the development column. Once they were finished, they would assign it to a tester, and the tester would start testing. If they logged a bug, it would show up there.
And then eventually, at the end, what you want is for the total to be in the done column on the right-hand side. During the first part of the two weeks, if you have things on the left-hand side, you’re not really worried; it’s looking great, everything’s making progress. But as you get closer to the end of the two weeks, you don’t want to see anything on the left-hand side that hasn’t been developed or worked on yet. You really want more things to be toward the testing phase at the end of the two weeks.
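The tally behind a dashboard like this can be sketched with a few lines of code. The item IDs, state names, and column order below are assumptions for illustration, not the actual DOTD board:

```python
from collections import Counter

# Hypothetical work items from a two-week sprint; the states mirror the
# dashboard columns described above, left to right.
work_items = [
    {"id": 1, "state": "done"},
    {"id": 2, "state": "done"},
    {"id": 3, "state": "testing"},
    {"id": 4, "state": "development"},
    {"id": 5, "state": "ready"},
    {"id": 6, "state": "backlog"},
]

# Column order as on the dashboard: left-hand columns are not-yet-started work.
COLUMNS = ["backlog", "ready", "development", "testing", "done"]

def dashboard_counts(items):
    """Tally work items per column, including columns that are empty."""
    tally = Counter(item["state"] for item in items)
    return {col: tally.get(col, 0) for col in COLUMNS}

counts = dashboard_counts(work_items)
print(counts)

# Late in the sprint, anything still in the left-hand columns is a warning sign.
left_hand = counts["backlog"] + counts["ready"]
```

Glancing at the two left-hand counts near the end of a sprint replaces drilling into every item in a 15-minute stand-up.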
So, this helped me accomplish my goal by letting me look at the dashboard and drill down only into the parts I needed. Some of the most common business measures are revenue, cost, client satisfaction, employee satisfaction, market share, and efficiency. These are just a few. For things like client satisfaction and employee satisfaction, you might need to do surveys to gather that feedback. In order to turn data into information, data governance is key. There’s a lot of work that goes into making the data useful and consistent for the results at the end, and you’ll see that in our demos when I get to those.
So, the first thing you need is ownership from the top. It takes a lot of work and a lot of hours, so you need a stakeholder at the top who says, “We want to do this. We want our data consistent. We want to make our field lengths the same. We want to name all the fields the same thing. We want to key it in the exact same manner and have consistency.” If you have ownership from the top, it’s going to go a long way for you. The next thing you have to do is have the correct people and processes, because you need people who know why the processes are in place, so that if you recommend a change, they’ll know whether you can or can’t make that change. And you need people who understand the data.
For instance, I was doing a project where I pulled data from one system and put it into another one; we were rewriting the system. It had a district that didn’t have a valid number on it. At first we thought it was invalid data, but as I kept digging and finally got the right people in the room, it turned out there was a 99 code that meant statewide. So it was valid data, and it wasn’t until I had the right people that I understood it. In the new process, we put in a checkbox to indicate statewide, so users wouldn’t have to know to pick district 99 if they wanted statewide. So again, presentation is key.
You also want to establish the metrics, as we said, but the data you need for the metrics might not always be available. First you need to see what your goal is and what metrics you need, and they’re not always going to be available, so you need to decide how you’re going to handle it if the data isn’t readily available. If it’s in another system, the preference is to pull it from that system so that you have a single source of truth. Once you establish that, you have to have the technology to get where you need to be: to make the field lengths the same, to make the field names the same.
So, you have to have the technology to pull the data from the other systems to have that single source. And once you get everything done, you want to continue to do quality checks, because you might find that data you thought would be filled out on a certain screen isn’t being filled out. There could be multiple reasons: the person wasn’t trained properly, or you thought the data would be available at that point but it really isn’t. The quality checks help you improve your processes along the way.
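Two of the data-governance chores just described, standardizing field names and checking for fields that never get filled out, can be sketched as follows. The field aliases and records here are hypothetical, not the actual DOTD schema:

```python
# Map the inconsistent field names found across systems onto one agreed
# standard name (hypothetical aliases for illustration).
FIELD_ALIASES = {
    "dist": "district",
    "district_no": "district",
    "doc_type": "document_type",
    "doctype": "document_type",
}

def normalize_fields(record):
    """Rename a record's fields to the agreed standard names."""
    return {FIELD_ALIASES.get(key, key): value for key, value in record.items()}

def quality_check(records, required_field):
    """Return the fraction of records where the field is blank or missing."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if not r.get(required_field))
    return missing / len(records)

raw = [
    {"dist": "61", "doc_type": "permit"},
    {"district_no": "", "doctype": "permit"},  # district was never keyed in
]
clean = [normalize_fields(r) for r in raw]
print(quality_check(clean, "district"))  # 0.5: half the records lack a district
```

A high missing-field rate is the signal to investigate: either users need training on that screen, or the data genuinely isn't available at that point in the process.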
The next thing we’re going to talk about is change management. Change management is the people side: getting the people on board with the change that needs to be made. The model that we like to use is the ADKAR Model, which goes through the phases of awareness, desire, knowledge, ability, and reinforcement. For awareness, you first have to know that there is something that needs to be changed, and then you have to make people aware that a change needs to be made. Then you have to instill a desire in the users to make that change; most of the time, they need to know what’s in it for them.
Then you have to provide the knowledge: you teach the employees how to do the process. It might be a new program, or it might be a change in the process, but you need to teach them how to make this change. Then you need the ability. If it’s new software, you need the money or the technology to do it; if you need a new server or training, you have to have the people and the equipment to make the change. And lastly, you need reinforcement. Once you make the change, you want to make sure people stay with it and don’t revert back to their old methods.
And finally, process improvement. It’s an ongoing cycle: technology changes rapidly, new data becomes available, you might acquire a new company. All of these things have to be taken into account, and you should improve your processes on a regular basis. The first thing you should do is inventory your existing business processes. Once you do that, you can identify opportunities for change. At that point you need to analyze the benefits and the ROI of a potential improvement. Sometimes an improvement might be nice, but it might be too costly, or it might take more time than you have. So you need to analyze that part as well.
The biggest thing to do is to develop an as-is workflow and a to-be workflow. If you develop the as-is workflow showing which departments the data runs back and forth through, or which processes it runs through, then you can identify where you can make improvements and build your to-be flow. That is useful if you’re going to select a new vendor: you can use it as your basis for selecting a vendor and for your vendor discovery. You can also use these, once you do select a vendor or if you work the project in-house, as your guide for creating test scripts. After that, you can use the whole flow diagram for future changes, and so the cycle continues, with you constantly improving your process.
Now, what I would like to do is show you the work of our counterparts at Access Sciences who have a contract with DEQ. DEQ was recently asked to make their electronic document management system more Google-like, more friendly, so they used some of the processes we just talked about to get from their existing system to their new system. I’m going to show you that, and then I’m going to turn it over to Brad to show you some improvements that DOTD has made. At DEQ, they had an application with a lot of documents that are public facing, a lot of permits, so they have people from other states and other countries constantly looking at their data.
The current state was outdated: it had an outdated look and feel, a steep learning curve, limited search capabilities, limited functionality, and limited customization. So they used the ADKAR Model as their change roadmap to get to their future state: a modernized look and feel, easy-to-use intuitive screens, expanded search capabilities, enhanced functionality, and new customization options. First, they were presented with a change that needed to be made, and then they sent opinion surveys to the users to get their feedback and see if anything additional needed to change beyond what was initially brought up.
After that, they wanted people to want the change, so they identified some change champions who delivered consistent messaging to the users to keep them informed and get them on board and excited about making a change. Then they introduced sneak-preview videos to show what was coming, along with job aids, videos, and tutorials, and got the users involved in the testing. For the ability phase, they leveraged the users to embrace the new system and responded to questions in a timely manner. And after that, once the new system was in use, they redirected the URL to the new system so that users could not revert back to the old one.
To do this, they also had metrics. They used Google Analytics to get real-time usage data, reports, and analysis, and to prepare charts and graphs. For usage, some of the things they looked at were the number of users, number of sessions, and number of page views. They also looked at the location of the users coming in: what country, what state, and what city. And they could look at feature use: what landing pages people were going to and what search screens were being used. This particular chart showed the classic system, which was their old system, and their modern system. The first section covers their expanded testing period.
You could see that there was more use on the classic, because not everyone was aware of the new system yet, and a little bit of use on the modern. Then, when they launched but the classic link was still provided to everybody, you still had a lot of users on the classic, and the modern was being used a little bit more. When they redirected people to the modern, they got more use on the modern, and when they finally retired the classic, there was no more use on it. We talked about making data consistent in the data governance model. One of the things they had to work on is all of these different items in the advanced search: they had to work with all of the groups to make sure the document types were consistent and the subtypes were consistent.
So that when they searched for a document, they would get everything, because if people named things differently, they weren’t going to get a good search. They worked really hard on getting that part consistent, and what they ended up with is a search screen where everyone could come and be really confident in what they were searching for. So now I want to actually show you the system and how they use their data. This is the front screen for their EDMS system, and I’m going to go to the advanced search. You can see there are over 7 million documents; this is the data that they’ve worked to clean up.
As I type in an agency interest, you can see the number of documents found changes to what will show in the result set. They can come down, narrow this down, and see how many documents they’re going to get. One of the really neat things they did: when you search, it shows the filters, and you can come down and see how many documents are accident prevention, or how many are air quality for media. You can also scroll down and use those filters. As you select them, they narrow down your result set. So if you’re coming in from another state or another country and you’re not really that familiar with the data, you can start with one thing, pull open the filters, and narrow it down until you get what you’re looking for.
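The faceted-filter counts just demonstrated boil down to counting documents per facet value and re-counting after each selection. The documents and facet values below are made up for illustration; they are not DEQ's actual schema:

```python
from collections import Counter

# Hypothetical result set: each document carries facet fields like
# document type and media, as in the DEQ advanced search.
documents = [
    {"id": "a", "document_type": "accident prevention", "media": "air quality"},
    {"id": "b", "document_type": "accident prevention", "media": "water quality"},
    {"id": "c", "document_type": "permit", "media": "air quality"},
]

def facet_counts(docs, facet):
    """Count how many documents carry each value of the given facet."""
    return Counter(d[facet] for d in docs)

def apply_filter(docs, facet, value):
    """Narrow the result set to documents matching a selected facet value."""
    return [d for d in docs if d[facet] == value]

print(facet_counts(documents, "media"))          # counts across the full result set
narrowed = apply_filter(documents, "document_type", "accident prevention")
print(facet_counts(narrowed, "media"))           # counts recomputed after narrowing
```

Showing the count next to each filter value before the user clicks it is what lets an unfamiliar user narrow 7 million documents down step by step with confidence.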
So now you have the documents, and at the top there are a lot of things you can do. You can put it in list view or other views. You can also copy the URL. For instance, if I select this document and I want someone to see it, I can come over to this clipboard icon; it gives me the URL, and I can copy that and send it in an email for someone else to look at. You can export the metadata to Excel, which would also have a link back to the system. And down here, you can click on the document and view the properties; it shows you all of the properties for this document in one location. You can also open up the documents in here.
And when you open the document, there are some things you can do. First of all, you can look at the document, and you have the different attributes over here on the right-hand side. But one of the neat things they did, as far as using their data, is that you can report a document error. If you click on this, a form comes up. If you look at the document and say, “Okay, I see something that really needs redaction,” you can click this radio button, and in the comments you can type why you think it needs to be redacted. Or if you think it’s a duplicate of another document, you can grab the hyperlinks, as I just showed you with the copy-URL clipboard, mark it as a duplicate, and in the comments put the two documents you think are duplicates.
If you think the document type is wrong, you just come in here and change it. Now, this doesn’t change it automatically. When you finish filling this information out, you click submit report, and it goes to the team, my counterparts at Access Sciences who work at DEQ, and they have a process for reviewing this data to decide if they can actually make the change. If they do, they don’t have to rekey it; they just submit it and it makes the change, or if they need to do redaction, they do that as well. They also have different levels: there are a couple of users whose changes they let go through automatically.
So, there are different levels of security, and some of these documents are also very secure documents, so a message will come up saying you have to request permission to see them. This is just one of the ways they have used the system and turned their data into information: not only can you pull a document up and see it, you can also report errors without having to know who to go to; you just click the button and send it in. So now I’d like to turn this over to Brad, and he’s going to show you some of the things that have been done at DOTD.
Yep. Thanks, Keli. Let me share my screen. Okay, I’m assuming my screen is being shared properly.
Brad, I don’t see anything yet. If you’d like to try again.
Okay. I can do that. How’s that?
There we go.
Waiting for it to load. I see your slides are perfect.
Great. Let me do that, okay. So as I said before, I’m the Director of Enterprise Support Services here at DOTD. One of the functions I manage for the department is the records management program, which Keli and her team helped me out with tremendously. Another part is that I’m the primary liaison to the centralized IT service provider for the state, the Louisiana Office of Technology Services. And like most large, mature organizations, DOTD found itself in a situation where authoritative business data was being generated and stored in various disparate business systems.
And as time goes on, that gets more and more complicated to deal with, right? Over the years, DOTD reviewed several potential solutions to address this issue, and with strong support from DOTD executives, we ultimately decided on an Esri system of engagement (SOE) solution to attack this problem. With an overall goal of providing DOTD with a centralized enterprise destination, where folks can come and consume, analyze, and contribute to enterprise data, we set out with some discovery with Esri, the GIS folks, in the spring of 2018. That discovery comprised two tracks for talking to business units: a business track and a technical track.
So, we were looking for concepts, to say, “Hey, business unit, what would make your job easier? Where is that data?” And then we looked at the technical ways we could use the infrastructure we were going to build out with the SOE to provide that. One of our goals was also to enhance and leverage some work we’d done on our enterprise GIS infrastructure, and that fit nicely with the Esri model, because we wanted to maximize the use of technology we already owned. So again, with strong support from DOTD executives, we developed a strategy to leverage location as the key to connect all these different business systems, and the implementation portion of the SOE kicked off in the fall of 2018.
So, I think this is a good snapshot of the model we use for the SOE. Essentially, you leave your existing business systems in place as the authoritative data sources, create reusable services to access data from those business systems, then create an app, or a suite of apps, utilizing one or more of those services to provide a single destination to find and share apps and content. Before, a system could be accessed and managed by the very few people who knew exactly how to get to the data, and they were the only folks who knew how to get to it. So getting to that data, and then turning it into information to make data-driven decisions, was really difficult for us.
Many groups from within DOTD, along with several other organizational partners, came together to implement and execute an efficient SOE app development process. DOTD, Esri, and the Office of Technology Services all partnered to ensure the appropriate infrastructure is in place to serve the SOE apps. Keli talked a little about the Agile method; that’s what we used here. We identified an app idea, a concept; we identified the folks who should be part of that app team; we did some data exploration to say, where would the data to drive this app come from? And if we knew we could get to that data, we would go through an iterative process of building out that application through to production.
And again, I talked about the infrastructure before. This is just a snapshot in time of how the servers and infrastructure are in place to support tying all these things together. And then of course the main portion is the data integration processes. We use FME Server to help us tie together all these disparate data sources from the different business systems, and they culminate in the centralized SOE app model. Through the strong partnerships I just talked about, the SOE team was able to release over 15 apps into production in the first year of the implementation project.
Actually, our goal was 15 apps for the first year, and we were able to produce 15 in the first cycle of the application project. That first cycle took about six months, so we hit our goal in half the time. On my screen here, I’d like to highlight the project viewer, one of the apps we produced in the first cycle. Previous to having this SOE app, we had a link on our internet site to all this data, but it was all in tabular form, and unless you really knew how to use that data, you couldn’t take it and make data-driven decisions with it, if you could even find it.
And so, what the SOE app helped us do was take this tabular form and stick it on a map, where you can just click and pull up all the same data. Much easier to access. Another thing the SOE apps helped us do in this first cycle was transform the way we did some of our business processes. Before the SOE app, our environmental team used to go out into the field, write their findings on a paper form with a pencil or a pen, take that paper form back to the office, use Microsoft Paint to make it look a little more professional, and then include that in their report.
With the SOE, they were able to use mobile apps out in the field to collect the data in real time and create reports on the fly in a much more efficient and professional manner. Considering that environmental work is part of just about every project here at DOTD, the time savings from the suite of apps we created for them are tremendous. If you extrapolate that across all the projects, that’s a lot of saved time, and the data you’re collecting is a lot more accurate. So I talked about the first year: our goal was 15 apps in a year, and we did that in six months. In the second cycle of the first year, we created 15 more apps in the SOE, and here’s a snapshot of some of those.
You’ll see here the bridge information tool, which is one I want to take a closer look at. In this tool, you can search, query, and export data on bridges. The data is pulled from multiple authoritative sources, I think up to a dozen, that you’d have had to go to one by one to grab this data before this app was in place. It provides basic information on bridge projects, bridge inspections, roadways; pretty much anything you want to know about a bridge at DOTD, you could learn it here. And I’ll try to do a little demo. Let’s see if I can do that. Okay, hopefully you can see my screen here. I’ll just back out here.
So, when you first go into the bridge information app, this is what you'd see: a map of Louisiana. It looks kind of cluttered, a lot of stuff on it, because it maps every bridge in the state of Louisiana. But instead of having to dig through a tabular format, you can actually go and say, "Hey, I want to know the information about the I-10 bridge in Baton Rouge. What do we have on that?" And as you can see here, we have a ton of information. For this particular bridge, the information is coming from nine different data sources, such as Assetwise, a multitude of data sources. We can even get the location of our microfilm records on this project.
So, all of that's right here at your fingertips, and you can find it as easily as saying, "Hey, I know where that bridge is," going to a map, and clicking on it. Let me get back to my slides. Sorry. So again, to date, we're about three years into the SOE implementation project, and about 50 new apps have either been released or are in the process of being released soon. Over those years, we've had over 1,500 unique SOE portal user accounts created. That means 1,500 unique folks have gone to our landing page and used at least one of these apps. Some of the most viewed SOE apps are open data.
We have an open data page that curates and governs the geospatial data that makes up the statewide topographic map of Louisiana, which is used by a lot of folks both within and outside of DOTD, and a couple of other apps. The bridge information viewer is one that gets a lot of hits, as well as some other apps that we have here. So, 50 apps in over three years; we've done a lot of good work. We've opened up access to a lot of data, but that's just the start, right? It's never going to stop; we'll be able to enhance existing apps or even create new apps that bring in new data sources. This is a living system. It's going to continue to grow and change.
So, there'll be some apps that maybe we don't need in the future that we'll retire and even replace with other things. But it's a great first step. It's given us tremendous value so far, and we're really proud of it and pleased with it. And really, the bottom line is it's helping DOTD achieve our mission: delivering a safe and reliable infrastructure system, enhanced mobility, economic opportunity, and public confidence. So, I think that's all I have here, and I can give it back to someone who can take it, I guess.
And you can go ahead and leave that up if you like, or-
…we can just look at our faces, that's fine. Keli, do you have anything else to add? We don't currently have any questions, but I'd like to invite attendees to type them into the Q&A; we can address any questions you have.
Yeah. I don’t have anything else.
Okay. We do have some comments in the chat. Looks like we've got some data heads in the crowd; they said the data's pretty impressive and looking good.
We do have about 10 minutes before our time is up, so I'd like to just wait and see if we have any questions. Please don't hesitate to ask; you can ask a question anonymously if you'd like. Okay, we have some questions that just came in. First question: who creates all these wonderful apps, which department?
Yeah, so it's actually a combination. We have a GIS team here that's augmented with some contractors that help us. We also have a team that works directly for me, a team of business analysts that go out and work with the business units. And then we have the DOTD business units that all collaborate to come up with these apps. You have a technical team, a business team, and then the business analysts. So it's really the SOE office; it's not really an office right now, but it's a collaboration of those folks.
Great. We have a question in the chat. I’m wondering how you integrated all those disparate data sources in your apps.
Yep. So that's why we chose location, because everything is somewhere, right? We chose location as our common key, for lack of a better word, across all those things to link them all together. In creating those services that I talked about earlier, you can connect to those disparate business sources, use location as a common key, and bring them together that way. That's a very high-level way of describing something that's very detailed. The FME workbench that I showed, with all the data linked together, is for just one piece of one app, but that's how it works on the back end.
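To make the "location as common key" idea concrete, here is a minimal sketch; the column names, coordinates, and rounding approach are invented for illustration and are not DOTD's actual schema or pipeline. Two siloed datasets that share no common ID, only coordinates, can be linked by deriving the same location key on both sides and joining on it.

```python
import pandas as pd

# Hypothetical siloed datasets: no shared record ID, only coordinates.
inspections = pd.DataFrame({
    "bridge_name": ["I-10 Miss. River Bridge", "US-90 Overpass"],
    "lat": [30.44307, 29.95012],
    "lon": [-91.19214, -90.07518],
    "last_inspection": ["2021-03-15", "2020-11-02"],
})
records = pd.DataFrame({
    "lat": [30.44309, 29.95010],
    "lon": [-91.19211, -90.07520],
    "microfilm_location": ["Roll 42, Frame 7", "Roll 98, Frame 3"],
})

def with_location_key(df, precision=3):
    # Round lat/lon to a fixed precision so near-identical coordinates
    # from different systems collapse to the same key (~111 m at 3 dp).
    return df.assign(loc_key=list(zip(df["lat"].round(precision),
                                      df["lon"].round(precision))))

# Join the two sources on the derived location key.
merged = with_location_key(inspections).merge(
    with_location_key(records), on="loc_key", suffixes=("", "_rec"))
print(merged[["bridge_name", "last_inspection", "microfilm_location"]])
```

A real pipeline (like the FME workbench Brad mentions) would use true spatial joins with tolerance distances rather than coordinate rounding, but the principle is the same: location is the key that links systems that were never designed to talk to each other.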
Well, related to bringing all those apps together, we have another question: how much work was needed to do data cleanup, and how did you get buy-in for the time to conduct that work?
Yeah, so I'll take those in reverse. On the time for data cleanup: we didn't do a whole lot of data cleanup beforehand, because what we wanted to do was build these apps and expose the data, right? A lot of these apps are internal, so it's not something that's out in the public very much, but once the data is exposed and you see inconsistencies or incorrect data, you can fix it at that point.
That's one of the secondary purposes of the SOE, I guess: to expose that data, make sure it's accurate, and when it's not accurate, to fix it. As for how we found time to carve it out: we had very strong support from our executive staff, meaning it was kind of dictated that we shall do this, and we worked through it and it worked out well.
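The "expose it, then fix it" approach Brad describes can start as nothing more than a few validation rules run over the newly surfaced data. A hypothetical sketch (the field names and thresholds here are invented for illustration, not DOTD's actual checks):

```python
# Once siloed data is surfaced in one place, simple rules can flag
# records that need cleanup, instead of cleaning everything up front.
bridges = [
    {"id": "B-001", "year_built": 1968, "lanes": 4},
    {"id": "B-002", "year_built": 2098, "lanes": 2},  # likely a typo
    {"id": "B-003", "year_built": 1990, "lanes": 0},  # suspicious count
]

def flag_issues(rec):
    """Return a list of human-readable problems found in one record."""
    issues = []
    if not (1850 <= rec["year_built"] <= 2025):
        issues.append("implausible year_built")
    if rec["lanes"] < 1:
        issues.append("lanes must be >= 1")
    return issues

for rec in bridges:
    for issue in flag_issues(rec):
        print(f"{rec['id']}: {issue}")
```

The design choice mirrors the answer above: rather than a big cleanup project before launch, exposure itself surfaces the bad records, and checks like these turn "someone noticed it looks wrong" into a repeatable process.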
Great. We have one question left right now, unless anyone else wants to ask anything. The question is: did you have any challenges with people accepting the new system and the new way of doing things?
Yes, but it wasn't a majority of folks. I talked a little bit before about some folks who had their own siloed business system and were the experts in that system, so people had to come to them to get to the data. Those folks didn't like this new liberated access to the data as much as others did, but on balance it was really widely accepted and appreciated.
Great. It looks like we don’t have any more questions. I’ll just give it another minute to be sure in case someone’s furiously typing.
Yes, someone needs to ask Keli a question.
Questions for Keli. Questions for Keli. You do have your contact information up on the screen here, so people can-
… contact you if they have questions after that. Okay, well, it looks like no more questions at this time. So thank you again, Keli and Brad, I thought that was a great presentation.