Instrument mastering is increasingly important in an asset-driven world and there are big risks for those who do not get it right. To help you understand the ins and outs of what instrument mastering is and how it can help businesses achieve better outcomes, we’ve unpacked it in our latest AlphaChat.
Transcript:
James Milne:
Hi, everybody. Welcome to another episode of AlphaChat. My name is James Milne, I'm the Head of Product here at AlphaCert, and I'm here to talk about instrument mastering with Stephen Huppert, an expert in the investment management industry. Welcome, Stephen.
Stephen Huppert:
Thanks James. A good place to start would be asking what is instrument mastering?
James Milne:
Instrument mastering is gathering together all of the data to provide you with that golden record for an instrument. As everybody in the investment management industry knows, we have many different players in the ecosystem: your custodian banks, your market data providers, your internal sources. All of those are giving you data based primarily on instruments, along with all of the valuation and transaction data that goes around those instruments. So it's really important to be able to tie together all of the data that comes from those different sources.
And when we talk about an instrument master, we talk about all of the reference data that surrounds the instrument. That can be the identifiers: all of the market identifiers, as well as things like custodial identifiers or proprietary identifiers from market data providers. You use those to match and consolidate instruments so that you don't get any duplication. That keeps each individual instrument master record clean, but it also keeps your entire dataset clean, because you're not creating additional instruments within your dataset.
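To make that concrete, here is a minimal sketch of what a consolidated instrument record with identifiers from several sources might look like. The field names, identifier schemes and values are illustrative assumptions for the example, not AlphaCert's actual schema.

```python
# Illustrative only: a golden record that carries identifiers from several
# sources, so inbound data can be tied back to one instrument.
from dataclasses import dataclass, field

@dataclass
class InstrumentMaster:
    internal_id: str                                  # proprietary house key
    name: str
    identifiers: dict = field(default_factory=dict)   # scheme -> value

record = InstrumentMaster(
    internal_id="INST-000123",
    name="Example Corp Ordinary Shares",
    identifiers={
        "ISIN": "US0000000000",        # market identifier
        "CUSIP": "000000000",          # market identifier
        "CUSTODIAN_ID": "CUST-98765",  # custodial identifier
        "VENDOR_ID": "VND-55501",      # market data provider's proprietary ID
    },
)

def is_same_instrument(golden: InstrumentMaster, inbound_ids: dict) -> bool:
    """Treat an inbound row as the same instrument if any identifier matches."""
    return any(golden.identifiers.get(k) == v for k, v in inbound_ids.items())
```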
Stephen Huppert:
And when we’re talking about the instruments in the context of investment management, we’re really talking about the securities, the equities, the funds, all the different things that we’re invested in.
James Milne:
Correct. And from a data perspective, if we look across asset classes, an instrument can be anything from a cash account or an FX trade or transaction, if you're keeping your instruments down at that level, right through your vanilla instruments such as fixed income securities like bonds, and your equities, ordinary shares and preferred stocks, right through to your derivatives: your options, swaps and futures, with all of the additional data and complexity that comes with each of those different datasets and asset classes.
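As a rough illustration of how the attribute set grows with asset-class complexity, the sketch below lists required attributes per instrument type. The type names and fields are assumptions for the example, not a definitive schema.

```python
# Illustrative only: different asset classes carry different attribute sets
# on the instrument master, and derivatives tend to carry the most.
REQUIRED_ATTRIBUTES = {
    "CASH_ACCOUNT":   ["currency"],
    "FX_FORWARD":     ["currency_pair", "settlement_date"],
    "BOND":           ["currency", "coupon_rate", "maturity_date", "issuer"],
    "ORDINARY_SHARE": ["currency", "exchange", "issuer"],
    "OPTION":         ["underlying", "strike", "expiry", "option_type"],
    "SWAP":           ["pay_leg", "receive_leg", "maturity_date", "counterparty"],
}

def missing_attributes(instrument_type: str, record: dict) -> list:
    """Which attributes still need sourcing before the record is complete."""
    return [a for a in REQUIRED_ATTRIBUTES.get(instrument_type, []) if a not in record]

print(missing_attributes("BOND", {"currency": "AUD", "issuer": "ISS-001"}))
# ['coupon_rate', 'maturity_date']
```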
Stephen Huppert:
And I imagine, knowing the super funds here in Australia, as they're getting bigger and bigger and their investments are getting more sophisticated, the broad range of instruments they're investing in, and the data associated with those, is going to become quite overwhelming for the organisations to manage.
James Milne:
It becomes larger and more complex the more types of instruments you invest in. So as you get into the likes of derivatives, the number of instrument attributes that you have to keep, and now also report on for compliance purposes and the like, is growing all the time.
So it's really important to, A, be able to tie all that data together in a consistent and accurate manner on the way into any system that you're using; and B, be able to get it out in a consistent manner, report on that data, and make use of it throughout the organisation.
Stephen Huppert:
So it sounds like for any sort of investment manager or fund manager to get the best value out of the data they have, having that consistency across the different silos, internal and external data, different types of instruments, and having that standard view, and I think the term you used was golden view or golden record, is imperative for treating data as a valuable asset for the organisation.
James Milne:
Yes. And when you talk about having data as a valuable asset, to use the instrument data effectively you really want to focus on making it efficient and automated: mastering it once, creating that golden record, and then using it everywhere, so that everybody within the organisation is using the same dataset. There's no contention around different instruments being called different things by different teams. Master it once and use it everywhere, and that's going to give you the most bang for your buck, if you like, the most efficiency and the most automation.
Stephen Huppert:
Even having that common language around the data attributes, we shouldn’t take that for granted either, should we?
James Milne:
Exactly. A common language gives you consistency around how different teams are using that data. And when we talk about an instrument master, it can also come down to different ways of classifying that data. So you may have a single instrument which is viewed in different ways by different teams, but that all becomes part of the master record. A specific instrument might be classified one way by a fixed income team and another way by an equities team. Take a region attribute, for instance.
Fixed income may see Japan as a region in its own right, whereas an equities team may see Japan as a country which sits within the APAC region. Those types of data scenarios can exist within an instrument master. A good instrument master has the flexibility to roll the data up in different ways and report on it in different ways, while it's still part of that single golden record. You're not creating two cuts of the same data point.
Everybody knows that when you're talking about an instrument in one context, these are the attributes that belong to it. In another context, it's 90% the same attributes, but maybe some classifications are different.
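Here's a minimal sketch of that Japan example: one golden record carrying more than one classification scheme, so different teams can roll it up differently. The scheme and attribute names are illustrative assumptions.

```python
# Illustrative only: one instrument, two classification views of the same data.
instrument = {
    "internal_id": "INST-000456",
    "name": "Example Japanese Government Bond",
    "classifications": {
        "FIXED_INCOME_VIEW": {"region": "Japan"},                     # Japan as its own region
        "EQUITY_VIEW":       {"region": "APAC", "country": "Japan"},  # Japan nested under APAC
    },
}

def region_for(inst: dict, scheme: str) -> str:
    """Roll the same golden record up differently depending on the reporting view."""
    return inst["classifications"][scheme]["region"]

print(region_for(instrument, "FIXED_INCOME_VIEW"))  # Japan
print(region_for(instrument, "EQUITY_VIEW"))        # APAC
```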
Stephen Huppert:
And it's normal to have internal teams using different language and using the data differently. I'd imagine that when data comes into the organisation, external providers will have their own terminology as well. So a big part of it is doing that translation so we've got consistent labelling internally.
James Milne:
Exactly. And that's what we spend a lot of time doing at AlphaCert: getting that consistency during the inbound process when we're taking in data from the custodian bank, combining that with market data providers, loading in some internal sources as well, and making sure that everything is labelled consistently. We've got consistent instrument types and subtypes, and classifications that are consistent across the organisation. And moving on from that, you can also see the data lineage, the transformations that take a classification at source, if you like, and map it into a system like AlphaCert through business rules or logic that's been applied.
Stephen Huppert:
And it sounds like as well as having to work with the external providers to understand the data coming in, you want to also work with the business to understand the various uses of the data to make sure that the instrument master is flexible enough and can work with the organisation.
James Milne:
Yeah, and that's key. I mean, the business really has to be involved in driving this from the outset: what do you consider your internal fund or house view for all of the instruments? How do you type those instruments? What do you refer to them as in terms of equities, stocks, bonds and all the derivatives? What are your naming conventions internally? And then what do you need to report on, or what do you need to comply with in terms of the regulations? How do they need to be expressed externally?
Getting that clear upfront matters, because all of the attributes that sit underneath each of those instrument types and subtypes need to be managed and updated, or an effective path of updates needs to be put in place around a source of truth from different suppliers. Once you've got the business to identify all of that, then that's where we can put it all in place.
But it’s really driven by the business and needs to be driven by the business in terms of what they need internally and what they need to send out to the market and send out to the regulators for compliance purposes externally as well.
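The kind of house view the business signs off might be captured as configuration along these lines: internal naming conventions per instrument type, plus how each type is expressed for external or regulatory reporting. All of the names below are hypothetical.

```python
# Illustrative only: a business-defined house view mapping internal instrument
# types to the internal label and the label used for external reporting.
HOUSE_VIEW = {
    "ORDINARY_SHARE":     {"internal_label": "Equity",       "regulatory_label": "Listed equity"},
    "GOVERNMENT_BOND":    {"internal_label": "Fixed Income", "regulatory_label": "Sovereign debt"},
    "INTEREST_RATE_SWAP": {"internal_label": "Derivative",   "regulatory_label": "OTC derivative"},
}

def external_label(instrument_type: str) -> str:
    """How an instrument type should be expressed in external reporting."""
    return HOUSE_VIEW[instrument_type]["regulatory_label"]

print(external_label("GOVERNMENT_BOND"))  # Sovereign debt
```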
Stephen Huppert:
And so when you’re doing work with a client on developing the instrument master, what are some of the major barriers or challenges that you have to deal with?
James Milne:
I think it always comes back to identifiers as one of the key things. So what we've had to do is bring in some identifier matching capability. We can say, "Well, first of all we're going to check some internal identifiers, which might be from the custodian. Then we're going to go through some market identifiers such as your ISINs or CUSIPs."
Provider A is going to give us two of the six identifiers, and Provider B is going to give us the other two, so we're going to have to match on a common identifier, which might be a ticker or an ISIN or something like that. So I think identifier matching is certainly the key challenge, but it's all about putting those things in place so that you have a clean dataset and you're not constantly creating duplicate securities because they've come in with slightly different identifiers from different providers and the like.
It's about keeping that golden record clean, not just for one security or instrument but across your entire instrument set, so you're not creating dupes or dirty data in there as well.
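A simplified identifier-matching waterfall along those lines might look like the sketch below: try a custodian identifier first, then market identifiers, enrich the existing golden record when there's a hit, and only create a new instrument when nothing matches. The scheme order and function names are assumptions for the example.

```python
# Illustrative only: match an inbound row to an existing golden record by
# working through identifier schemes in priority order; create a new record
# only if nothing matches, so duplicates aren't introduced.
MATCH_ORDER = ["CUSTODIAN_ID", "ISIN", "CUSIP", "SEDOL", "TICKER"]

def match_or_create(inbound_ids: dict, master: list) -> dict:
    for scheme in MATCH_ORDER:
        value = inbound_ids.get(scheme)
        if not value:
            continue  # this provider didn't supply that identifier
        for record in master:
            if record["identifiers"].get(scheme) == value:
                # Enrich the golden record with identifiers it was missing,
                # rather than creating a duplicate instrument.
                for k, v in inbound_ids.items():
                    record["identifiers"].setdefault(k, v)
                return record
    new_record = {"identifiers": dict(inbound_ids)}
    master.append(new_record)
    return new_record

master = [{"identifiers": {"ISIN": "US0000000000", "CUSTODIAN_ID": "CUST-98765"}}]
match_or_create({"ISIN": "US0000000000", "TICKER": "EXAM"}, master)
print(len(master))  # still 1: matched on ISIN instead of creating a duplicate
```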
Stephen Huppert:
Well it’s clear that there are some very obvious benefits of having that instrument data master because you wouldn’t have to go through that on a regular basis. What are some of the other benefits to an organisation of having that good solid cohesive investment instrument data master?
James Milne:
I think it comes back to what I mentioned earlier around efficiency and automation. Once you've got everybody agreeing on what the consistent set is, you can move forward and start to use that set across various use cases within the business. Everybody needs instrument data, and everybody needs clean instrument data. So if you can agree upfront: this is our set of instruments; this is how they are put together; these are our sources in terms of their priority, primary and secondary; these are the rules about which inputs can update securities and set attributes; these are the attributes that can be locked by the business if we don't want them overwritten. All of those rules can be pulled into an instrument master to keep it nice and tight and nice and locked down, so that everybody knows exactly what it looks like and the rules associated with it.
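Those rules, an ordered source priority per attribute plus business-locked attributes, might be expressed along these lines. The attribute names, source names and function are assumptions for the sketch, not a definitive implementation.

```python
# Illustrative only: attribute-level update rules with an ordered source
# priority and a set of business-locked attributes that feeds cannot overwrite.
SOURCE_PRIORITY = {
    "price":          ["MARKET_DATA_VENDOR", "CUSTODIAN"],   # primary, then secondary
    "coupon_rate":    ["CUSTODIAN", "MARKET_DATA_VENDOR"],
    "classification": ["INTERNAL"],
}
LOCKED_ATTRIBUTES = {"classification"}  # maintained by the business, never overwritten

def apply_update(record: dict, set_by: dict, attribute: str, value, source: str) -> bool:
    """Apply an inbound value only if the source is allowed, outranks whoever set
    the current value, and the attribute isn't locked by the business."""
    if attribute in LOCKED_ATTRIBUTES and attribute in record:
        return False
    allowed = SOURCE_PRIORITY.get(attribute, [])
    if source not in allowed:
        return False
    current = set_by.get(attribute)
    if current in allowed and allowed.index(source) > allowed.index(current):
        return False  # a secondary source can't overwrite the primary's value
    record[attribute] = value
    set_by[attribute] = source
    return True

record, set_by = {}, {}
print(apply_update(record, set_by, "price", 101.5, "MARKET_DATA_VENDOR"))  # True
print(apply_update(record, set_by, "price", 99.0, "CUSTODIAN"))            # False: lower priority
```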
And once you've got that, then you can say, "Okay, this is our set. Who wants it?" and go out to the various parts of the business, on the reporting side of things for example. We've also got one of our customers on the investment banking side who needs to be aware of anything that could possibly be traded in the market. Their use case is to bring in all of the instrument data on specific exchanges, so that anything that could potentially be traded by a customer is held within that instrument dataset.
So the various parts of the business, whether it's your buy side or the investment banking side, can all use that instrument master.
Stephen Huppert:
And it sounds like there’ll be some real advantages from a compliance and risk management perspective as well.
James Milne:
I think so. Especially if you can show things like data lineage: what your process is around creating securities, updating securities, and getting data out from a system such as AlphaCert into the various other systems which consume that instrument master data. If you can demonstrate all of that within a system, a user interface and a process, it certainly makes it a lot easier on the regulation side to show that you're complying and doing everything you should.
But also from a risk perspective, you can show your internal risk teams that you have robust controls around instrument mastering: you can identify, down to a very granular level, where data attributes have come from, who made changes to an instrument and at what time. And you can show exactly that from a compliance and lineage perspective.
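A minimal sketch of the kind of attribute-level audit trail that makes this demonstrable: every change records the old and new value, who or what made it, and when. The structure and field names are illustrative assumptions.

```python
# Illustrative only: an append-only audit trail of attribute changes, capturing
# old value, new value, who or what made the change, and when.
from datetime import datetime, timezone

def record_change(audit_log: list, instrument_id: str, attribute: str,
                  old_value, new_value, changed_by: str) -> None:
    """changed_by may be a named user or an inbound file, e.g. a custodian feed."""
    audit_log.append({
        "instrument_id": instrument_id,
        "attribute": attribute,
        "old_value": old_value,
        "new_value": new_value,
        "changed_by": changed_by,
        "changed_at": datetime.now(timezone.utc).isoformat(timespec="minutes"),
    })

audit_log = []
record_change(audit_log, "INST-000123", "coupon_rate", 4.25, 4.50,
              "custodian_feed_2024-06-01.csv")
```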
Stephen Huppert:
And that lineage and the auditability of that would give a lot of comfort to the risk and compliance folk.
James Milne:
Absolutely. When you can pinpoint, down to the minute on the day, that a data attribute changed from X to Y and who made that change, whether it was a user or an inbound file that came in and updated that data point, it gives you a lot more visibility and therefore, as you say, a lot more auditability and security around any of those changes.
And really, all of these things are nigh on impossible in a spreadsheet environment. When you've got data coming in and flowing from various sources, tying that data together is hard enough to start with, but maintaining audit and control around that data is really difficult when you've got manual processes and tools like spreadsheets.
Stephen Huppert:
And one of the things I certainly see with some of the super funds I deal with here in Australia is that a lot of effort goes into trying to reconcile different reports on different spreadsheets, and trying to get agreement between different teams or different parts of the business, because of that reliance on spreadsheets, which lack that control and auditability.
James Milne:
And once you've got that, it comes back to the efficiency gains you might get. One of the other benefits you'd see is that people are actually spending their time on data analysis rather than trying to tie data together and make heads or tails of what's coming in from different providers.
If you do that via an automated process, bring it all into a single central source, master it once and use it everywhere, you're going to see gains. Your users of that data will be happy that they've got a consistent data source, and the people responsible for that data will gain a bit of time back in their working day because they're not wrangling spreadsheets together.
Stephen Huppert:
Sounds like there is enormous value in having that central, enterprise-wide data management tool which can bring all the data together in one place, with the right data model underpinning it. Because without that right data model underpinning it, you don't get the same efficiencies.
James Milne:
And we've been talking about instrument mastering or security mastering; those two terms are really interchangeable. But it's also part of that wider data mastering piece, because an instrument has attributes around it. Once you start getting into attributes such as your region, your country or your currencies, you get onto your legal entity structures for issuers, parent issuers and global parent issuers, and your borrower or broker hierarchies. Once you start to master all of those things together, your instrument obviously has various connections into those other data points around it.
But once you have a master that covers not only your instrument mastering but also that wider data mastering, and you have golden records for all of those things, that's when you really start to cover the whole data model with a golden record for all of your investment data management.
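As a toy illustration of that wider data mastering, the sketch below links an instrument to an issuer golden record and walks up to its global parent. The names and identifiers are made up for the example.

```python
# Illustrative only: the instrument master linking out to issuer golden records,
# which themselves form a parent / global-parent hierarchy.
issuers = {
    "ISS-001": {"name": "Example Subsidiary Ltd", "parent": "ISS-002"},
    "ISS-002": {"name": "Example Holdings Plc",   "parent": None},  # global parent
}
instrument = {"internal_id": "INST-000789", "issuer": "ISS-001"}

def issuer_chain(issuer_id: str) -> list:
    """Walk from the instrument's immediate issuer up to its global parent."""
    chain = []
    while issuer_id:
        chain.append(issuers[issuer_id]["name"])
        issuer_id = issuers[issuer_id]["parent"]
    return chain

print(issuer_chain(instrument["issuer"]))
# ['Example Subsidiary Ltd', 'Example Holdings Plc']
```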
Stephen Huppert:
Things like the golden record and the single source of truth tend to become almost clichés thrown around the data space. But I think they are so important to have, to avoid the mess of data, the duplication and the inefficiencies that arise when you don't have that single source of truth.
James Milne:
Absolutely. And getting back to the time saved: when you can go to a single place and query all of your instruments in a specific classification, type, region, country or currency, and then combine that with the portfolio holdings and start to drill into your exposure, your currency exposure, your counterparty exposure, et cetera, it becomes really powerful. It's a bit of a game changer versus going to a set of spreadsheets and trying to tie that data together to get an answer.
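Once classifications and holdings sit in the same place, those exposure questions reduce to simple aggregations, as in the sketch below. The data, field names and common-base-currency assumption are illustrative.

```python
# Illustrative only: summing portfolio holdings by any instrument attribute
# (currency, region, and so on) once both live in one central dataset.
from collections import defaultdict

instruments = {
    "INST-1": {"currency": "JPY", "region": "Japan"},
    "INST-2": {"currency": "USD", "region": "North America"},
    "INST-3": {"currency": "JPY", "region": "Japan"},
}
holdings = [  # market values assumed to be in a common base currency
    {"instrument": "INST-1", "market_value": 1_000_000},
    {"instrument": "INST-2", "market_value": 2_500_000},
    {"instrument": "INST-3", "market_value": 500_000},
]

def exposure_by(attribute: str) -> dict:
    """Aggregate holdings by an instrument attribute, e.g. currency or region."""
    totals = defaultdict(float)
    for h in holdings:
        totals[instruments[h["instrument"]][attribute]] += h["market_value"]
    return dict(totals)

print(exposure_by("currency"))  # {'JPY': 1500000.0, 'USD': 2500000.0}
```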
Stephen Huppert:
And the pressure to have more transparency and more disclosure, both with the regulator and with consumers, is becoming more and more important on both sides of the Tasman, I think.
James Milne:
Absolutely. And it will only increase, I think.
Stephen Huppert:
Great. And AlphaCert is a great way to get that central source of truth.
James Milne:
Absolutely. Our key value proposition, really, is bringing that data together in one place so that we can make sense of it, we can validate it and make sure it's correct, and we can get it into the hands of the users who need it in a timely manner, whether that's downstream systems or spreadsheet models, those types of things.
It’s a key part of our offering to solve those business processes and give you time back in your day.
Stephen Huppert:
Thanks, James. Well I understand instrument mastering a lot more now, so thanks very much.
James Milne:
No problem. Thanks Stephen, and we’ll see everybody on the next AlphaChat.