What Makes A World Class Innovation Lab? Experian DataLabs and Eric Haller

This episode is sponsored by the CIO Scoreboard

I am excited to share my conversation with Eric Haller. This is the first time I have had a guest on the show who runs a major innovation lab, so our conversation will give you a window into several areas.

Major Take-Aways From This Episode:

  • Why he believes that “Ruthless Competency Wins”.
  • Innovation is more powerful when linked to a higher purpose.
  • Managing an innovation portfolio and the intellectual property that comes out of these efforts.
  • The organizational structure he reports to.
  • The culture of Experian from the types of people Eric hires as well as the culture of the Experian business as it supports the development efforts of the Innovation Lab.
  • LDA (Latent Dirichlet Allocation) – a machine learning technique.
  • Unsupervised and supervised Machine Learning.
    • Unsupervised is based on recognizing patterns.
    • Supervised – modeling known outcomes.

About Eric Haller

Eric Haller is the Executive Vice President of Experian DataLabs, which is responsible for developing innovative products generated from breakthrough experimentation leveraging machine learning and data assets from a variety of sources. He led the creation of labs in the US, UK & Brazil that support research & development initiatives across the Experian enterprise. New products developed in the labs cover mobile, payments, consumer & commercial credit, fraud, targeted marketing & healthcare. Prior to Experian, Eric was responsible for new products at Sequoia Capital-backed Green Dot, where he created and brought to market the first credit card a consumer could purchase off a j-hook in a retailer. Eric also co-founded the identity fraud detection business iDawg, which was later renamed ID Analytics. ID Analytics was acquired by LifeLock, which is now part of Symantec. Other roles held by Eric include Chief Marketing Officer of the first publicly traded machine learning company, HNC Software (acquired by FICO), and executive roles with Visa & MasterCard.

Full Transcript

Bill: Welcome to the show today.

Eric: Thanks, I'm really excited to be here.

Bill: Eric, as I was reading a lot of the articles that have been written about you, Experian, and the Innovation Lab, can you give an overview of what, in your mind, defines an innovation lab?


Eric: [00:01:00] Well, you know, it's funny, because as soon as you asked that I remembered an executive meeting, I want to say about three weeks ago. Somebody asked me, these are the guys that run the company, and they asked me what my thoughts on innovation are. I said, "You know, it's funny, because in our lab we never even say the word innovation." I think it's more in the DNA of what you do. You look at the world and you think pretty much anything could be improved. It's almost a culture of looking at things around you and saying, gosh, this really ticks me off, or I wish we could do this, or I don't understand why this can't happen. Then putting some, some would say, mental muscle behind it to try to figure out how to change the situation.

[00:01:30] As far as the term innovation lab goes, we are a lab here, comprised mostly of scientists and engineers, and I think that's exactly what we're doing. We're looking at the world around us, the one that our company Experian plays in, and we just see lots of opportunities to do good things with the data that runs through our business and the data that runs through our clients.

Bill: How did you happen to grow into this position within the organization? What have your steps been to get to the point you're at right now with Experian?


Eric: [00:02:30] My background coming to Experian was a bit diverse. I've been working in product development, startups, and marketing for basically my entire career, which, I'm embarrassed to say, is more than 20 years; we'll just leave it at that. Actually, I'm a two-termer at Experian. I was here in the 90's and worked in M&A and strategic alliances. I left to start up my own company, focused on ID fraud, which I sold, called ID Analytics. It became part of LifeLock, and LifeLock, if you read the press, just got bought by Symantec for 2.4 billion dollars.

[00:03:00] I left to do a startup and did some other things, and then I got recruited back in, and this time it was to run a product P&L; I managed about a 500 million dollar P&L here at the company. I realized that it was very hard to develop things organically that were swing-for-the-fences type product opportunities. In a rigid P&L environment where you're delivering on metrics and KPIs that you've committed to your managers and your investors, taking risks becomes a bit more challenging. You have to figure out creative ways to squirrel away the right resources to take those risks.


[00:04:00] I felt at Experian we could be doing a lot more. I did have a lot of trust from our executive team based on a lot of the things that we'd done in the past, but I'd never started up a lab from scratch, and I'd never managed a team of scientists before. I had managed software engineers, but never scientists. I just pitched this notion internally and told them what I wanted to do, how much money it would cost the company, and what they could expect from it over a few years, and they funded it.

I guess you could say I'm lucky that way. I didn't have a resume that said hey, Eric's got a track record of standing up labs. That wasn't it at all. It's the first time I've ever done something like this.

Bill: Does your background as an entrepreneur help?


Eric: [00:04:30] I think it helped a lot, actually, yeah. Once you've been in the business of tackling something, where it's an idea and you come up with a way to solve for it, you realize how big a factor execution plays in it. That's the way I personally tackled the lab. I'm not a scientist or engineer myself, but if you looked at my whiteboard from the first week we started up the lab, we had a team meeting and talked about what it would take to be successful. We came up with these three words: ruthless competency wins, which is ironic because we're talking about innovation.

[00:05:00] We all believe that the world's not short on ideas. We see problems around us all the time. We see a lot of different ways to solve those problems, but inevitably it comes down to getting the best people against it and executing on it. I think if you come from a startup environment, you totally get it. That's, we'll say, one of the big pieces of the recipe for success, and that's what we've tried to replicate here in the lab.

Bill: Who do you have to partner with within your organization? Are you on the edge, on the fringe, sort of self-funded, or do you have to engage with the CIO or CTO or other business units?



Eric: [00:06:30] One thing that is very helpful: I actually report to our Global Chief Operating Officer. The way Experian is organized, we have four regional Chief Executive Officers. Each regional CEO manages all the people in that region. We have 17,000 people, so the person that manages our North America region manages about 6,000 people. All of those CEOs report to our Global Chief Operating Officer, so I share the same manager as all the CEOs. For me, that's made it a bit, let's say, easier to get traction, because those CEOs control all resources within the region. So when I do run into hiccups or whatever, we can sit down and say, hey, is this worth trying to plow a new field for, or should we take a different direction?

[00:07:00] To peel back the onion, I would say that we have to be in a position to work with a lot of different people across the organization: business unit presidents, sales teams, product and marketing teams, everything from outsourcing to new ideas. A lot of those ideas come from the market, but we also get a lot from our business units. Once we've built something and incubated it, we need to return it back to the business. We need to be in sync with those product management teams and those development teams, and then of course with sales. Our sales quotas are prioritized based on the business units, so for us to get sales bandwidth, we need to make sure the business units have bought in, so they can provision us that space on, we'll say, sales bandwidth.

Bill: Do you have to interview clients as well? Are you ...



Eric: [00:08:30] We do spend a lot of time with clients; yeah, that's actually where we probably get the most traction. We've been doing this now for six years. We've got labs in three countries: one here in North America, one in London, one in Sao Paulo. The notion there is that when the sales team is engaged with clients, they meet with a CMO, a CRO, a CEO, and they start to have that strategic dialogue about where that company is going, what major hurdles they're going to have to tackle and wrestle with to make their numbers for the year. That's typically when they'll pull us in, so we are part of that conversation and we can listen to their highest-priority strategic objectives. Then we come back to the shop, think it through, and ask: is that something we can engage the client on? Can we come back to them with something that they'll want to partner with us on, to take a swing at trying to solve whatever that issue might be?

Bill: [00:09:00] Oh, interesting. And is it typically, Eric, around data, and finding and mining really useful information from data, as far as a product that you're looking to develop would be concerned?


Eric: [00:09:30] It is, yeah. That's actually why we call them data labs; in that respect, I don't see too many other folks coining the name of their labs that way. We are all about data, so it's not infrequent for a client to pass billions and billions of transactions to us so we can evaluate them. We'll merge it with our data. Experian has, gosh, a lot of information of our own that drives the businesses that we operate, and we do that all in our own computational environment that is dedicated to our labs.

Bill: What's the most common misconception? You must have people reaching out to you just to even understand how they can do it themselves in their organization. What's the most common question you hear people ask?


Eric: Believe it or not, they're typically not technical questions. Maybe three or four years ago, we would get questions like, gee, you operate a Hadoop cluster; what were the benefits that you saw going from mainframe to Hadoop, or something like that. I would say three or four years ago it was much more common to get a question like that, but now most of the questions are: how in the heck are you able to attract the type of talent that you are attracting into your lab? Because right now, that's probably the single greatest hurdle most large companies are facing in the data and analytics space: attracting the right talent to put against the opportunities.

Bill: [00:11:00] Do you find that it's possible for a company to rent that expertise? Have you thought about coming up with a rental model where people can patch in, or turning into an IBM Watson, so to speak, of the capabilities you have?


Eric: [00:11:30] It's a cool idea. I don't think we would be interested in something like that. Our business model here for the lab is really focused on developing intellectual property, so just hearing the term rent makes me think that the intellectual property would most likely be owned by whoever is doing the renting, not the person or the company being rented. I think that would be the question mark in my mind: how do you wrestle through that? When we develop IP, the notion is that we're building a product that we can take out across the market.

Bill: Yeah, that's a really important point. So one of the primary objectives of the data lab is to develop IP, based on these unique capabilities you're building, to help your customers.

Eric: That's right, so it becomes that funnel for new product development for the rest of the company.

Bill: Is it something that you try to scale, like if you solve it for one client with this unique capability then you're looking to scale that across your portfolio, or is it something you just try to deliver to that one particular customer?


Eric: [00:13:00] Yeah, that's actually an excellent question. I certainly wasn't clear on that. We are very selective about what we go after. The reason why is because the end game is developing IP for products that we're going to take to market. We look at opportunities through that lens. We're looking at two things, primarily the probability that we'll be successful, because we are operating on that fringe, and while we might get that adrenaline rush of chasing after something that nobody's been able to solve before, we really have to have a line of sight to a finish line. We typically don't take broad discovery-type opportunities unless we think that the discovery might yield some treasures along the way.

[00:13:30] We look at that, and we also look at the magnitude of impact. We want to see something that's going to be a needle mover. It's nice to hit singles with products, but in a company that does close to five billion dollars in revenue, we're making the investments in labs to make sure that we can build things that could fuel top-line growth for our business. We talk to a lot of clients, and I'm not going to say that a client "wins the lotto" when we decide to choose something to work on, but it is a bit like that, because a lot of times we're making the investment in developing whatever that product is.


[00:14:30] If we're building something with a client, they don't have to participate in the financial risk of developing something from scratch. We'll take that risk on ourselves. If we're successful and we build something that solves a problem and they want to use it, they can certainly license it. That's good for us, and it's good for them, but really that's what we mean when we say risk-free environment. We talk about that a lot with clients. We'll talk about the fact that our computational environment is completely cut off from the rest of the world, so you don't have to worry about somebody networking in and going after the data that's been placed there. We don't operate with personally identifiable information, so even if somebody did actually get to it, there are no associated social security numbers, names, addresses, and things that can be toxic if they're compromised.

[00:15:00] They don't have to worry about participating in that financial risk, since we're doing this all on our own dime. Then, of course, we own the intellectual property that we develop. So far, that model's worked. I think if I hung a shingle out and said, hey, you can rent our resources, I probably could charge pretty good rates, because I think they'd be like, wow, that's great. But it wouldn't be good for Experian. I think we get way more out of the investment by generating the IP and building new products than we would renting ourselves out.

Bill: [00:15:30] A lot of people listening are going to be more resource-constrained. I want you to talk about how you would approach innovation in a constrained environment, maybe in a company that has 500 or 5,000 employees. Before we get into that, though: to reduce risk, because you're so selective about what you chase after, you must have a portfolio where you gauge risk, like where your home runs versus your singles are going to be. Do you have that visual for the team? How do you determine what's going to be a home run versus a pure single, or do you not look at it that way?


Eric: That's so funny. Yeah, if you looked at my whiteboard right now, I've got about 16 projects that are stacked in any given month. This is just for our North America lab, and we have similar pipelines for Brazil and the UK. Actually, it's funny that you brought that up, because this morning we were having a team meeting. I was meeting with the guys that manage delivery, and we were just thinking to ourselves, you know, when you're load-balancing heads against these projects, you're asking, all right, where do I want to put our investment? We talked about where we thought the biggest hits might be in the market, revenue-wise.

[00:17:00] We're pretty fluid here. Believe it or not, for a company where the business folks all have MBAs and the scientists all have PhDs, you'd think we'd have a lot of structure around business cases and those types of things. We don't; everything's white-boarded out, conversational. I'd much rather have an hour-long whiteboard session on a business opportunity than read a 16 or 20 page business case. I've always had the philosophy that by the time you write the business case, you've had to think through a lot of things, you already knew what the answer was, and you're spending most of your time trying to justify it. I think you get through a lot of that with a pretty serious dialogue. We all believe in that, but yeah, that's exactly what we do.

[00:17:30] We do have a portfolio. We look at the risks associated with it and the potential returns, and if we think we're putting time into something and it's looking worse and worse as the weeks go by, like this was a real dog, then we shut it down. We go back and we talk with whoever we need to, to do that. Typically, I would say the success rate is actually pretty high. If it makes it to the board and we're actually putting resources against it, probably close to half of these wind up generating revenue for the company.

Bill: Interesting. So is the key to that being in front of the customer? Just seeing and measuring what they're trying to solve? How do you ...


Eric: [00:18:30] Yeah, you have to be tight with the market. I did manage more on the product side and, we'll say, the business side. If you're trying to size a market, you're going to spend a lot of time with clients. You're going to spend some time, maybe, with some guys that are researching the market and defining market sizes, but we do spend a lot of time with clients. As far as defining the market, we'll go and communicate with our business unit colleagues when it makes sense, if we need to get some more dimensions around the market. About 20% of the lab are actually people with backgrounds in product development and consulting, so that's a lot of what they do. It's more on the business side of building products.

Bill: Do they tend to have a design-thinking orientation, or is it more ... What would be the skills of someone on that side of the fence versus the data scientist side?

Eric: Well there aren't many.

Bill: Okay.



Eric: [00:20:30] Those that we have, I want to say every single one of them went to graduate school. I'm not sure if they all have MBAs, but most of them do. They all have technical undergrad degrees, and most of them spent time working in consulting. Maybe some of this is my own personal bias: I like hiring smart consultants who spent some time at McKinsey or Accenture or Oliver Wyman or BCG, because these are folks that have had to engage with clients at a senior executive level. They're used to operating in an environment where they start with a blank sheet of paper and have to put something together. They're used to thinking critically about how to solve a problem. In this case, we're looking for individuals that also have a tech background, so when they're working with our scientists, they're translating problems to a point where our scientists and engineers can engage to think through how to solve for it. I would say that makes us a bit unique in the company, because we have all those pieces together under one roof.

Bill: With the data scientists, are they typically versed in Hadoop and Python? Is there a certain type of product set that they come with, or do they just have more of the analytics side of a computer science background at their disposal?


Eric: They know Python and R, and they've worked in some kind of MapReduce-type environment. But I would say that to get in as a scientist, we screen for their algorithm ability, so we give them tests they have to take. The tests will test their ability to solve for specific algorithms. They will test their coding abilities, so we give them what we believe to be typical coding assignments for the lab. When you're working with large datasets, how are you going to write a super-efficient piece of indexing code? We'll give them the challenges that we always see, and then they can write their code in whatever language they choose. If they want to write pseudocode, that's fine too, although they'd probably get marked down for that.
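An indexing exercise of the kind Eric describes can be sketched with a classic inverted index: precompute a token-to-record map so that lookups over a large dataset become dictionary hits instead of full scans. This is a minimal illustration, not Experian's actual test; the toy records and token names are invented.

```python
from collections import defaultdict

def build_inverted_index(records):
    """Map each token to the set of record ids that contain it,
    so a lookup is one dictionary access instead of a full scan."""
    index = defaultdict(set)
    for record_id, tokens in records:
        for token in tokens:
            index[token].add(record_id)
    return index

# Toy "transactions": (record id, merchant tokens) -- invented data
records = [
    (0, ["amazon", "online"]),
    (1, ["grocery", "instore"]),
    (2, ["amazon", "instore"]),
]
index = build_inverted_index(records)
print(sorted(index["amazon"]))  # -> [0, 2]
```

Building the index costs one pass over the data; every subsequent query is constant-time per token, which is the efficiency trade-off such a screening exercise is usually probing for.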


[00:22:30] Then we score it. We score their coding ability and we score their algorithm ability, and that's just to be a data engineer. If they're going for a data scientist role and they don't have experience, say they're just coming out of school with their PhD, we'll typically give them a block of data. They get three days. They have to clean it, they have to analyze it, and they have to come up with their own techniques they want to use, whether it's logistic regression scoring or some type of machine learning approach. Then they have to come back with what they've learned and present it to a room of about a dozen scientists and explain the process that they took it through. Typically, on top of that, we'll have them present on what they received their PhD in, so we can get a feel for how well they articulate what they know when they are an expert in something.


[00:23:30] Then they go through almost a full day of interviews, where it's usually two or three on one, with a whiteboard, and they're given challenges that we would typically see in the lab, and we see how they whiteboard out their solutions. It's pretty grueling. I'd say of the last three scientists that we hired, one has their PhD from MIT, another from Yale, and another from Princeton. We definitely are pulling in, what I'd say ... That's not a requirement, that you go to one of those schools. We really don't care, but those are the folks that are making it through the process.

Bill: I'm assuming those three-on-ones are really looking for teamwork and collaboration. Is there a certain part of this where you're just looking for heroes, or are you trying to see if they can participate in the dynamics of group collaboration, or is that not really that important for what you are trying to do?


Eric: Oh definitely, it's definitely important. What I like to say, we'll say this is at 50,000 feet: my mantra is, we don't hire jerks. We have to live with these folks; everybody that we work with, we see them all the time. We're in tough meetings where we're wrestling through stuff. There are lots of opinions. We try to be as fact-based as possible. We'll have debates over techniques, we'll have debates over all sorts of things, but in the end, we all want to be able to hang out and enjoy each other's company. We definitely are looking for humility in the process.


[00:25:30] I would tell you that we have definitely not hired people who did well throughout the entire process, did exceptionally well even, but came with a lot of ego, because we just know that, day in and day out, that actually can be counterproductive. When you say heroes, actually, I say superheroes. My take is that everybody in this lab has a particular strength where they tend to be able to do whatever that is better than anybody else. We see that a lot, we do see that a lot, and we're always looking for it. When we see somebody that brings, because of what they did their doctorate work on or whatever, a perspective or background or some technical acumen that we're lacking in the lab and we see an opportunity to leverage, that's actually a big plus.


Bill: [00:26:00] I was just listening to a podcast by, I forget who, it'll pop into my head, but he talked about the concept of weak-link sports, like soccer. This is so true: if your right winger's weak, it doesn't matter if all ten other players are strong. If you have one weak link, the other team can exploit that weakness. The counterpoint is basketball: if you're playing basketball, you can really lean on one or two stars, but it's very difficult in soccer. He called it a weak-link sport, so I can see how, with the complexity and the amount of skills you need, finding a strong, solid team would really make what you're trying to do much more effective.


Eric: [00:27:00] Yeah, in this lab the weakness might come from youth, just from lack of experience. We do on occasion hire undergrads, when they are just incredible. I actually hired one person that didn't even graduate from college, but he built his own Hadoop cluster in his garage. It just so happened his brother had a PhD from Stanford and was working for Google, and he was the rebellious little brother, but he was a genius, so we grabbed him. I tend to care less about educational credentials; it just turns out that we wind up hiring some people with some very impressive credentials.

Bill: In one of your blog posts you wrote about breakthrough experimentation and innovation. I was curious: what experience or story are you most proud of, with your team or the end deliverable to a customer, that best fits that breakthrough experimentation and innovation quote that I read?


Eric: [00:28:00] Well, I have a few of them. I'm looking at my whiteboard, thinking, what would I want to talk about? There are a few of them, but maybe I'll talk about a couple. There was one bank that was sitting on top of a lot of card transaction data, like 40 million customers. They were saying, you know, we've got all this data, but we really don't have a good measurement of or insight into how to leverage it. We know a lot of people are leveraging card transaction data, but we've seen all of those solutions. In fact, we actually buy some of them. We still feel like we're leaving a lot of value on the table, but we just don't know what it is. They brought this up with us and basically said, well, if we gave you this data, what could you do with it? Amaze us.


[00:29:00] We came back with a whole bunch of different ideas. We took what we thought to be the top three and went back to them. I'm going to talk a little bit about one of them, which they got excited about and wound up saying, yes, let's do it. There are a lot of segmentation frameworks in marketing that try to leverage demographic data or geo-demographic data so that you can use it for targeted marketing. The more information you can use up front to segment out your market, the better response rates you'll likely get when you go to market to the individuals in those segments.


[00:30:00] We said, "What if you created spend behavior segments, based just on card transaction data, so the populations you'd be creating are 100% reflective of their true, actual card spend?" What kind of results might you get in the market if you targeted somebody who only uses their card online? They're a heavy-duty online shopper. You see lots of transactions with Amazon or Apple. Maybe their transactions lean toward the online channel. What if you chose to market to them only online? What if you offered them a product where they got discounts or some rebate back whenever they used their card online? What if you chose to email them their offer versus putting it in the mail? You cater your offerings, and you cater how you communicate with that customer, based on how they use their card.

[00:30:30] We said, "Okay, let's chase that rabbit." We leveraged a topic discovery technique called LDA, Latent Dirichlet Allocation. It's an unsupervised machine learning technique, the same kind of technique Google uses when you do a search. We imagined that the card accounts were just like documents that Google would search. The data within those accounts were just like the words you would use for topic discovery when you search for words within a document. We categorized all that spend into these segments and we played that back.

[00:31:00] Now, we didn't know what we were going to get. That's the risk. We took this unsupervised machine learning approach and applied it to literally billions of credit card transactions on tens of millions of accounts, not knowing what the results would be. When we dialed it in, we were blown away by how clear the segments became. Once the clusters were created, we actually went and looked at them and said, "Wow, we could actually create marketing segments from these."

[00:31:30] We went back to the bank, and now the bank uses those segments. We consistently re-cluster all of their accounts every month. They use those segments across all sorts of different ways that they market their products and communicate with their customers. They're having huge success in how people are responding and using their cards. That's an interesting breakthrough. I want to talk about the other projects, but it would just suck up all the time, me talking about projects.
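The account-as-document analogy Eric describes maps directly onto off-the-shelf LDA implementations. Below is a minimal sketch using scikit-learn, with invented toy data: each "document" is one card account, each "word" is a merchant token from that account's transactions, and the account's dominant topic becomes its spend segment. This illustrates the technique only; it is not Experian's pipeline, and the merchant names are hypothetical.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is one card account; its "words" are merchant
# tokens from that account's transactions (invented toy data).
accounts = [
    "amazon apple netflix amazon online amazon online",
    "grocery gas grocery pharmacy grocery gas gas",
    "netflix apple amazon online online apple netflix",
    "gas grocery gas pharmacy grocery grocery pharmacy",
]
counts = CountVectorizer().fit_transform(accounts)

# Two latent "spend topics"; each account gets a mixture over them
lda = LatentDirichletAllocation(n_components=2, random_state=0)
topic_mix = lda.fit_transform(counts)  # rows sum to 1.0

# The dominant topic becomes the account's spend segment
segments = topic_mix.argmax(axis=1)
```

At real scale the same idea holds: re-fitting (or re-scoring) the model monthly re-clusters every account, which matches the monthly re-clustering Eric mentions.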


Bill: [00:32:00] Well, no, I think it's important for people to hear the story, because otherwise it just becomes theoretical. It makes it come alive for me as well. What's interesting is the machine learning. When you say unsupervised, are you saying that you pointed the algorithm at the set of data? Did you give it instructions like look for this or look for that, or did it have to build the patterns it found? Was it just completely unsupervised, or did you give it any sort of head north, head south?


Eric: This was unsupervised. For some of your listeners who might be a little lost, I'll try to keep the difference super clear. The difference between unsupervised and supervised: supervised means you have some known outcomes that you're trying to model for or look for. Think of, hey, I've got a bunch of people that have an illness like cancer. I know what cancer looks like, I know the symptoms, and I know what symptoms they had before they were diagnosed. Now I've got a hundred million people, and I'm going to try to be really smart about figuring out who might have cancer and not know it yet. I have a known outcome that I'm actually modeling for, and I've got something that I can feed my process so it knows what that looks like.
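That "known outcome" is exactly what a supervised learner consumes: features plus a label for every training row. A minimal sketch with scikit-learn's logistic regression (the features, values, and labels here are invented toy data, not from the interview):

```python
from sklearn.linear_model import LogisticRegression

# Supervised learning: every training row carries a KNOWN outcome.
# Toy features: [symptom_count, risk_score]; label 1 = diagnosed.
X = [[0, 1], [1, 2], [1, 1], [5, 8], [6, 9], [7, 9]]
y = [0, 0, 0, 1, 1, 1]

model = LogisticRegression().fit(X, y)

# The fitted model then scores new people it has never seen:
# a high-symptom profile vs. a low-symptom profile.
print(model.predict([[6, 8], [0, 2]]))  # -> [1 0]
```

The key point Eric makes is visible in the `fit(X, y)` call: without the `y` of known outcomes, this family of models has nothing to learn from.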

Bill: It makes sense.

Eric: In unsupervised, you're really doing more pattern recognition. You're looking at a lot of information, and you don't really know what you're looking for, but you'll know when an anomaly creeps out. You'll know when it's, hey, I've never seen fraud like this before.

[00:34:00] I'll give you a perfect example. We had a large bank say, "We've got all these fraud detection systems for our card transactions. When we see fraud, we feed our systems what fraud looks like, and they're great at finding things that are similar. But in that environment, they're not so good at finding something that no one's ever seen before."

Bill: Sure.

Eric: For unsupervised, it's a perfect opportunity. It's saying, you know what, most of the time this looks pretty normal, but this spike comes up and I've never seen that before. For clustering, for topic discovery, things where you're just looking to map for patterns, it's really good for that. Maybe that's the best way to draw the distinction.
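That "spike I've never seen before" intuition is what unsupervised anomaly detectors formalize. A minimal sketch with scikit-learn's IsolationForest on invented toy transactions: no fraud labels are supplied; the model learns what normal looks like and flags the outlier on its own.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# 200 "normal" transactions: roughly $50 purchases around 2 p.m.
# (features: [amount, hour of day] -- invented toy data)
normal = rng.normal(loc=[50.0, 14.0], scale=[10.0, 2.0], size=(200, 2))
# One spike nobody has seen before: a $5,000 purchase at 3 a.m.
spike = np.array([[5000.0, 3.0]])
X = np.vstack([normal, spike])

model = IsolationForest(random_state=0).fit(X)
flags = model.predict(X)  # +1 = looks normal, -1 = anomaly
```

Nothing told the model what fraud looks like; the spike is flagged purely because it sits far outside the learned pattern, which is the capability Eric contrasts with label-fed fraud systems.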


Bill: Map for patterns [crosstalk 00:34:31]. Yeah, I think it's interesting. I know the IT security world has to go that way as well, because there are so many zero-day issues that the systems have to start to learn; they absolutely have to start to learn for the unknown, for the pattern, something that breaks the pattern. It's interesting how you're having success using a similar technique in that example you gave me, because IT security has to go the same way.

Eric: [00:35:30] Yeah, network security is actually a big project we're just starting to engage in. If you think about it, Experian has data centers all over the world, and our clients have the same issues we do in terms of protecting against attacks. We've got all this data running through our systems on attacks and how they're coming in, and we said, gee, why don't we leverage our own data and see what we can develop in terms of products or services that not only detect an intrusion or potential intrusion as it's occurring, but predict where it may come from next. We're actively working on that project using Experian as our internal client. Then, as we gain knowledge and some success, we can turn around and go to market.

Bill: Yeah, one of the biggest issues is, and I was going to ask you this, STIX and TAXII draw on federal government data sources that don't talk to commercial enterprises. The data the public sector has, the FBI or DHS, they're not sharing with the commercial sector, but it's there. It's interesting: if that data could be mapped, all of a sudden you could see what our nation's defense can see, then marry that and ingest it into our commercial tools. Then you'd have almost a biological model where we can respond the way the body does. The body doesn't have to think, oh, I've got a threat in my big toe, it's infected; it just deploys. It's interesting that your team could potentially deploy against that really tough challenge: unstructured data flowing between public and private, figuring out what the pattern is so that we can respond to security events.

Eric: That sounds super interesting. That's exactly the kind of stuff we'd like to work on.

Bill: Well I know we didn't start this conversation to talk about that, but the parallels are totally related.

Eric: Oh yeah, that's right. I'll have to follow up on that one, it sounds good.

Bill: [00:37:30] Well listen, I really, really appreciate you coming on and educating me and my audience about Experian's capabilities and what you've developed with the data labs. You've definitely gotten my mind turning a bit, and I'm sure you did for our listeners as well. Is there anything you were dying to share with my listeners that we didn't get to, that we could wrap up with?

Eric: [00:39:00] Thanks for that, and thanks for the opportunity to just sit and talk for a little while. Yes, actually, this is a little bit of a shameless plug. If any of your listeners are data scientists, and I don't know how many are, we are always looking for the best and brightest out there. If anything I said spurred some interest, I would also say there's a certain level of purpose in what we do. Experian operates in healthcare, auto, and credit, so if you're interested in bringing credit to those who have no credit footprint in the market, stimulating the economy for businesses that don't have a voice, or helping hospitals identify the patients who carry the most risk of being readmitted, then figuring out how to predict that and what treatment and follow-up strategies might work best, these are all things we consider really good, noble projects that can have tremendous impact in the market. If anyone out there is interested, I would definitely love to see their resume coming in.

Bill: That's fantastic. I know we spent some time on the types of people you're looking for. I think for sure, we'll put in the show notes a link. How would you prefer people talk to you, through Twitter, LinkedIn? What's your preferred mode, email?

Eric: Oh thanks, yeah, that's great, either is good. We'll make sure we follow up and give you both.

Bill: Sure, absolutely. I think it's interesting you mentioned purpose, because from a leadership point of view, the teams you have working together aren't just scientists. You have very smart business analysts, and you have people really thinking about the higher purpose around the data.

[00:40:00] I was just in a conversation the other day, actually with a university professor. He was really concerned that his students don't want to study anything other than programming. I pushed back, because I really believe that to solve some of the world's biggest challenges, we're going to need to look at the scope of the problems we're willing to go after and marshal teams of not only really strong data scientists and programmers, but also people who interface with clients and customers and see what problems we can deploy against.

Eric: [00:40:30] Yeah, you're right on the money, and that's how we've designed the lab. That's exactly it. We try to pull together all the pieces under one roof. It makes a big difference.

Bill: Well Eric, I appreciate your time today. Thank you very, very much, and we'll link everything up in the show notes so people can connect with you if they'd like.

Eric: Thank you very much Bill, I appreciate your time.

Bill: Absolutely, take care. Have a great day.

Eric: You too, bye, bye.

How to get in touch with Eric Haller:

Key Resources:

This episode is sponsored by the CIO Scoreboard, a powerful tool that helps you communicate the status of your IT Security program visually in just a few minutes.

* Outro music provided by Ben’s Sound

Other Ways To Listen to the Podcast
iTunes | Libsyn | Soundcloud | RSS | LinkedIn

Leave a Review
If you enjoyed this episode, then please consider leaving an iTunes review here

Click here for instructions on how to leave an iTunes review if you’re doing this for the first time.

About Bill Murphy
Bill Murphy is a world renowned IT Security Expert dedicated to your success as an IT business leader. Follow Bill on LinkedIn and Twitter.