Everyone these days hears about the double-dip recession, the euro crisis, the Greek debt crisis, Irish instability, Polish debt, and now a bond bubble. Let's understand some basics of bonds and why some economists and investors think we may see trends similar to those we saw in the housing bubble.
The last crisis grew from the easy availability of credit with no constraints on borrowers, coupled with greed on the part of investing companies that packaged the same debt into exotic yet risky products and funneled it up & down the financial industry chain, never realizing the simple truth that if a majority of borrowers were to default, the cascading impact could swallow global companies like Lehman Brothers, AIG, and Bear Stearns into bankruptcy. We are yet again seeing the potential seeds of similar history being sown.
Bonds & How They Work:
Countries, corporations, charities, cities, and counties all need resources to fund their operations. To beef up those resources they can either take loans from financial institutions or issue bonds, which in layman's terms are just loans with contractual terms, in return for interest paid over the term of the loan plus the principal at maturity. But the packaging of contractual terms and features, coupled with market conditions and risks, can make bonds very complicated.
The contract terms cover the interest rate (aka coupon rate), maturity, callability, convertibility, security, and principal, and bonds are traded just like equities, at either a discount or a premium, depending on the quality of the issuer and the risk associated with the loan. The quality, risk, and returns of all bonds are rated by major agencies like Moody's, Standard & Poor's, and Fitch based on the issuer's background and the contract terms, enabling customers to pick bonds that suit their preferences on risk & reward. So, in general, the higher the risk, the higher the yield an investor seeks for a given set of contractual terms. Note that these ratings change over the life of the bond, which makes things complicated in a dynamic free market. The market price of a bond has an inverse relationship to interest rates.
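The inverse price-yield relationship can be made concrete with a small present-value calculation. Here is a minimal sketch in Python (the 5% coupon, 10-year, $1,000 face value bond is a hypothetical example, not any specific issue):

```python
def bond_price(face, coupon_rate, ytm, years, freq=2):
    """Price a plain-vanilla bond as the present value of its cash flows."""
    c = face * coupon_rate / freq        # periodic coupon payment
    n = years * freq                     # number of coupon periods
    y = ytm / freq                       # periodic market yield
    pv_coupons = sum(c / (1 + y) ** t for t in range(1, n + 1))
    pv_principal = face / (1 + y) ** n
    return pv_coupons + pv_principal

# A 5% coupon, 10-year bond with $1,000 face value:
print(bond_price(1000, 0.05, 0.04, 10))  # market yield below coupon: premium
print(bond_price(1000, 0.05, 0.05, 10))  # market yield equals coupon: par
print(bond_price(1000, 0.05, 0.06, 10))  # market yield above coupon: discount
```

When the market yield rises above the coupon rate the price falls below par, and vice versa, which is exactly the inverse relationship noted above.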
Within financial institutions these are also referred to as debt, fixed income, or credit. The size of the global bond market as of 2009 is estimated at $91 trillion, and the US leads the bond market with $32 trillion, followed by Japan. The majority of the $899 billion in daily trading volume happens between broker-dealers and large institutions.
Individuals can invest in bond funds, and an array of indices is available for managing a portfolio and measuring performance, such as the Barclays Aggregate, Citigroup BIG, and Merrill Lynch Domestic Master.
Junk Bonds:
These are high-risk, low-rated debt instruments with a promise of high yields, issued by corporations to entice the public to subscribe to their bonds with no guarantees. This type of bond is the subject of this discussion.
With interest rates having been very low for a good period of time, investors are still not embracing the equity market for fear of a double-dip recession, yet they are seeking steady returns by investing in these high-yield junk bonds while ignoring early warnings reminiscent of the credit bubble, mainly around creditworthiness.
In the last nine months, companies have already sold $172 billion in junk bonds, according to Dealogic, a data provider. The factors behind this scenario appear to be the declining rate of corporate defaults and the generally healthy balance sheets of corporations. Also, interest rates do not seem likely to rise in the near future, on the assumption that the Fed will unleash a tidal wave of freshly printed US dollars to avoid a double-dip recession and alleviate the US job market. Such action is already in full play by major countries like China, Japan, Russia, and South Korea, which have already printed additional sovereign paper currency to make sure their currencies are not overvalued, which would negatively impact their exports.
More and more corporations are seeking this route, with both strong and not-so-strong ratings on their bonds. Companies with weak credit ratings are paying just 6.2 percentage points above Treasurys, down from 20 percentage points in 2008. Bonds with fewer protective covenants and provisions but high yields are gobbled up by investors, and this echoes symptoms similar to those we saw at the start of the housing bubble, when financial institutions were more than willing to provide loans even when a customer's credit rating was unfavorable or downright unworthy.
Investors, with their appetite for finding niche markets in which to plough and park their funds for returns, have hopefully not forgotten the recent memories of the credit bubble, which are still lingering for the majority of them.
Sunday, September 26, 2010
Wednesday, September 22, 2010
Use Oracle Hints or Not
There have been many heated discussions on this subject among developers and DBAs, at times taking a pretty ugly turn, with one camp saying "Yes" and the other saying "No", as if to emphasize that we live only in a Yes & No world. Some tuning experts who were brought onsite at a client to resolve performance issues arrived with their own arsenal of "Do's" and "Don'ts", and their first option for tweaking the code was putting hints in the queries. This I jokingly call "Injecting of Hints".
Is this the right approach? Did we understand the root cause before putting in a hint? Will the hint deliver the same execution speed should the data in the underlying join tables change? Will it still hold up if the client moves to a different version of the database? What happens if the hint is deprecated? These are some of the valid questions developers and DBAs should take into account before including one in a query.
The Oracle cost-based optimizer (CBO) is built to provide the best execution plan for a query based on the statistics it has at the time of execution. Should the characteristics of the underlying data change drastically while statistics on those changed objects are not recaptured, then it is worth putting a hint on the query to execute it faster. But injecting a hint into the query in this way should be a temporary fix, not a permanent one.
Does this mean we should not use hints in the first place? No; the question begs for more details on approaches, goals, complexity, database type (OLTP, OLAP, MOLAP, ROLAP), query nature (ad hoc or tuned), data maintenance (one-time uploads), any known issues with the database, special query requirements that otherwise wouldn't work, and so on.
Oracle has broadly built different types of hints to address the above and many other scenarios, and these hints serve as a Swiss-knife-like tool for issues at the database, session, or individual query level. (Remember, a Swiss knife is not equivalent to a butcher knife, if that is what you need in the first place.)
- Hints for Optimization Approaches and Goals (All Rows, First Rows etc)
- Hints for Access Paths (Cluster, Full, Index, Hash, RowID etc)
- Hints for Join Orders (Leading, Ordered etc)
- Hints for Join Operations (NL, Merge, Hash etc)
- Hints for Parallel Execution (Parallel, No Parallel etc)
- Additional Hints (Anti Join, No Cache, Materialize etc)
This is one area Oracle has been constantly improving, and in 11g it has come up with methods of collecting extended statistics. Collecting stats using the Oracle DBMS_STATS package results in manifold performance improvements over init.ora changes, as even Oracle gurus acknowledge. This package has become the core of stats collection, and surprisingly many developers have only skimmed its surface. Besides, using this package one can transfer statistics from a production server to, say, a build server to compare execution plans against varying degrees of data in the build environment.
Oracle recommends using the Automatic Optimizer Statistics method for stats collection, in an appropriate maintenance window, for ALL objects in the database whose statistics are absent or stale; this avoids any need for manual intervention. The timing and setup of the maintenance window are critical, so that it runs when processing is idle and data for the current or previous period is already loaded, avoiding manual re-runs of the stats.
Oracle provides several flexible features for the collection of stats, and users need to understand these and weigh several important considerations while using this package:
- Statistics Gathering Using Sampling (Estimate Percent, Auto Sample Size etc)
- Parallel Statistics Gathering (Degree of Parallelism)
- Statistics on Partitioned Objects (Local/Global, Granularity, Incremental etc)
- Column Statistics and Histograms (Understand skewness of data at column level)
- Extended Statistics (Column grouping for accurate selectivity within a group, Expression Stats)
- Determining Stale Statistics (Monitor Staleness on Inserts/Updates/Deletes, Truncation etc)
- User-Defined Statistics (Stats on User defined Indexes, Functions)
- Set Preferences on Stats collection package DBMS_STATS (Cascade, Degree, Granularity etc)
Secondly, I prefer to keep on top of software upgrades and improvements and implement these features as I go along on the development side, rather than keep singing and humming the "old is gold" or "old was gold" rhymes.
Digitizing Is the Way..
Is it the million-dollar question today whether eBooks (digitized versions of printed books) will become the dominant force and place the last nail in the coffin of brick-and-mortar companies such as Borders and Barnes & Noble and a host of publishing houses? You bet, if the last quarter's earnings reported by Amazon this year are any indication of the direction in which this industry is moving. Amazon reported that digitized versions of its books outsold hardcover copies for the first time, and researchers agree that this is not a blip but a sure indication of things to follow in the market.
Today (September 22, 2010) the Wall Street Journal carried a story titled "Web Start-Up Values Soar", painting a picture of an overcrowded field of tablet computer makers such as Apple (iPad), Dell (Streak), Samsung (Galaxy Tab), RIM, Acer, Asus (Eee Pad), Lenovo (LePad), and Cisco (Cius), all potentially trying to catch up with this very important paradigm shift in consumer habits & behavior.
eReader manufacturers (Sony, Amazon, Barnes & Noble, EBS Technology, Condor Technology Associates, Kogan Technologies, Spring Design, Kobo Books, iRex Technologies, Wolder Electronics, Onyx International, iPapyrus, Hanvon) are also playing a big role in accelerating this worldwide movement into the digital era, with different flavors and support levels.
One can pick & choose an eReader based on its file format support (.epub, .pdb, .arg, .azw, .djvu, .html, .lbr, .lit, .mobi, .opf, .pdf, .pdg, .tr3, .txt, .xeb, .mp3) and on preferred features such as size, weight, screen resolution, shades, operating system, touchscreen, WiFi, text-to-speech, integrated dictionary, directory organization capability, amount of memory, card reader, user-replaceable battery, and web browser.
This change will alter the dynamic relationship between authors, publishers, and retailers along the lines of profits in the $40-billion-a-year retail book industry and the roughly $480 billion global digital publishing market. Direct sale to the end customer eliminates the need for shipping, warehousing, returns, stock inventory, and out-of-print scenarios, resulting in lower cost to the consumer.
Authors now have the ultimate freedom and opportunity to showcase their work while avoiding agents and the long approval processes at reputed publishing houses. Instead, websites like http://www.scribd.com/ provide a window for any author to publish their work and earn 80% of sales, compared to the 7 to 15 percent royalty that would otherwise be paid.
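The gap between the two royalty models is easy to quantify. A quick sketch (the $9.99 price and 1,000-copy volume are hypothetical numbers chosen purely for illustration):

```python
def author_earnings(list_price, copies, author_share):
    """Gross author earnings for a given revenue share."""
    return list_price * copies * author_share

price, copies = 9.99, 1000  # hypothetical eBook price and sales volume
self_published = author_earnings(price, copies, 0.80)  # ~80% share on Scribd-style sites
traditional = author_earnings(price, copies, 0.10)     # mid-range of a 7-15% royalty
print(round(self_published, 2), round(traditional, 2))
```

At these assumed numbers the self-published route pays the author eight times what a 10% traditional royalty would.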
There are still many challenges that need to be addressed before eBooks make hardcover books obsolete. These newer gadgets (eReaders/tablets) need to be cost-effective and should be able to download any article from any publishing source for a price. There should also be an exchange market for swapping one article for another, to proliferate the usage of digital articles.
Next Trends:
Innovative digitizing, mainly of textbooks and fiction, with appropriate drill-downs and 3D features on tablet computers could be the next wave that customers can't wait to see...
Monday, September 20, 2010
Rebuild Financial Data Model: Start with Security Master
Drivers for rebuilding efforts:
Regulatory, risk management, and KYC (Know Your Customer) compliance requirements are the major drivers currently behind several key initiatives worldwide in the financial industry.
Major Benefits:
Among these initiatives, a Security Master for stock & fixed income instruments at the enterprise level is a potential candidate that could provide big gains and ease of maintenance within the enterprise financial data model, besides providing data accuracy and flexibility to its various downstream applications from one single source of truth.
Current Design Issues:
The "Security Master" in its current form and shape has evolved over a period of time and carries heavy processing baggage and a huge amount of data redundancy even for basic reporting needs. Secondly, these evolved systems are built on mainframe or older technologies and are not scalable or flexible enough for demanding customer reporting requirements. Thirdly, the cost of maintaining a single truth on instruments turns out to be a pretty expensive proposition, because copies of the same data set exist in different silos with varying degrees of relevance by time period. Lastly, they are built around outdated batch processes and are not real-time.
Security Master A Data Store:
The Security Master is a data store holding an integrated view of financial securities information on issuers & issues from global exchanges, easily identifiable through cross-reference identifiers like CIN, ISIN, CUSIP, and ticker symbols, along with industry classification, exchange listings, dividends, splits, ratings (from Moody's, Standard & Poor's, Fitch), yield rates, etc.
The Security Master serves as a "Universe of Securities", addressing the various subject-area needs of Equities, Fixed Income, Swaps, Repos, Commodities, Futures, Derivatives, Funds, Options, Indexes, Securities Lending, Custodial Services, Asset Management, and Alternative Investments for both operational and reporting needs, each with different data set requirements. In many companies the Security Master is a subset of the integrated financial data model.
This data store gets its updates daily, on demand, or in real time from several major vendors like Bloomberg, JJ Kenny, IDC, and others.
Some of the key data sets that could potentially be included in the financial data model are:
- Fundamental Data: S&P Compustat (Basic), Annual and Quarterly BackData (S&P), Worldscope, Market Guide (Multex), Valueline, Zacks
- Fundamental Research w/Ratings: Ratings Direct Global (S&P), Ratings Direct for Munis (S&P), Fitch IBCA for Munis & Corporates, Moody's High Grade/Bank Research, Moody's Leveraged Finance
- Pricing Databases: Fund Runs (FT Interactive), Merrill Lynch, JJKenny, MuniView (FT Interactive), Telekurs, Bloomberg Data
- Other Fundamental Research: Ned Davis (via Factset)
- Quote Systems: Bloomberg, ILX, Reuters RTW, Instinet, TM3 (Thomson), Marquee
- Analytics: Factset, Salomon YieldBook, Baseline, Wilshire Atlas, Reuters R&A, MMD (Thomson), PCPoint (Lehman), MONIS, KBC Financial, Vestek
- Events/Conferences Alerts: CCBN (includes Street Events), Transcript Service (CCBN), Event Briefs (CCBN)
- Benchmarks: MSCI Indices, S&P Index Alert, S&P Constituent History, Russell Indices, FTSE
- Proxy Service: ISS (Institutional Shareholder Svcs)
In some organizations, the Security Master is closely linked with corporate actions data, which mainly consists of the following types:
- Cash Dividend
- Stock Dividend
- Stock Split
- Name Change
- Domicile Change
- State of Incorporation
- Round lot change
- Ticker Change
- Re convention
- Re denomination
- ID Number Change
- Equity De-listing
- Change in listing
- Equity Listing
- Variable Interest Reset
- Voting rights Change
- Currency Quotation Change
- Shareholder meeting.. etc
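In a data model these action types typically become a controlled vocabulary so that downstream feeds can be validated against it. A partial sketch in Python (the enum name and string codes are illustrative assumptions, and only a subset of the types listed above is shown):

```python
from enum import Enum

class CorporateActionType(Enum):
    """A subset of the corporate action types listed above."""
    CASH_DIVIDEND = "cash_dividend"
    STOCK_DIVIDEND = "stock_dividend"
    STOCK_SPLIT = "stock_split"
    NAME_CHANGE = "name_change"
    TICKER_CHANGE = "ticker_change"
    ID_NUMBER_CHANGE = "id_number_change"
    EQUITY_LISTING = "equity_listing"
    EQUITY_DELISTING = "equity_delisting"
    SHAREHOLDER_MEETING = "shareholder_meeting"

# An incoming action code can be validated by constructing the enum:
print(CorporateActionType("stock_split").name)  # prints "STOCK_SPLIT"
```

An unknown code raises ValueError, which gives the load process a natural rejection point for bad feed data.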
Fixed Income Instrument Master Attributes:
- Security id
- CUSIP
- ISIN
- SEDOL
- Short name - short name for the security.
- Type - The first level of security classification.
- Sub type – The second level of security classification.
- Country of incorp - country where the issuer company is incorporated.
- State - The state of the security. For example active, defunct, bankrupt and so on.
- Last status change date
- Rating
- Principal payment frequency and other details related to it like last date and next payable date
- Issuing details
- Issuer name
- Face value
- Paid up value
- Non paid amount
- Start date - The date on which security is created in the market.
- End date
- Underlying security - another security associated with this security, as in the case of ADRs.
- Contractual income -cash dividend is paid contractually.
- Convertible - indicates security is convertible to another security
- Taxable - indicates that security is taxable.
- Contractual settlement - indicates contractual settlement is allowed. This is usually used in trade settlement.
- Issue currency
- Stock exchange listings
- Units
- Price
- Volumetric details
- Legal restrictions
- Interest details
- Factor
- Percentage outstanding
- Pool factor details
- Other linked security details, like the specific instrument, type, quantity, etc.
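As a rough illustration, the attributes above might map onto a record type like the following Python dataclass. This is a minimal sketch, not a prescribed schema; the field names, defaults, and the sample CUSIP are assumptions for the example:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FixedIncomeInstrument:
    security_id: str                         # internal identifier
    cusip: Optional[str] = None              # cross-reference identifiers
    isin: Optional[str] = None
    sedol: Optional[str] = None
    short_name: str = ""
    sec_type: str = ""                       # first level of classification
    sub_type: str = ""                       # second level of classification
    country_of_incorp: Optional[str] = None  # ISO 3166 country code
    state: str = "active"                    # e.g. active, defunct, bankrupt
    rating: Optional[str] = None
    issue_currency: Optional[str] = None     # ISO 4217 currency code
    face_value: Optional[float] = None
    convertible: bool = False                # convertible to another security
    taxable: bool = True

# A hypothetical record as it might arrive from a vendor feed:
bond = FixedIncomeInstrument(security_id="SEC-000001", cusip="037833100",
                             sec_type="BOND", issue_currency="USD")
print(bond.state)  # prints "active"
```

Keeping the cross-reference identifiers (CUSIP, ISIN, SEDOL) as separate optional fields is what lets one record serve as the single source of truth across feeds that key on different identifiers.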
The choice of data model is key for any successful Security Master data hub implementation. Some organizations have built their own, while others have purchased financial data models off the shelf from vendors like FTI or GoldenSource and implemented the Security Master on top of them. There are two sections in any financial data model: one that changes often, like pricing data, and one that changes rather infrequently, like issuer information. Nevertheless, a financial data model should serve as a "Universe of Securities" with tight linkage between its various functional data sub-models.
Data Load Enablers (ETL/Scripts):
The data load enablers can be synchronized either through generic scripts or via ETL jobs to update/insert newer data, with appropriate data governance, into the various entities of the financial data model. There are four phases in any data load process: cleansing, standardization, matching, and merging.
Data Cleansing
- Data Cleansing is the act of detecting and correcting (or removing) corrupt or inaccurate records from a record set.
- A data profile will be carried out to check inconsistencies in data
- Standard corrections will be performed after the extract process
- These corrections will be performed by validating and correcting values against a known list of data issues.
- Standardization helps us to transform inconsistent data into one common product or entity representation.
- This includes uniform abbreviations, correct spelling, formatted patterns, etc.
- This task will be performed by building a consensus with data stewards and owners on acceptable standardizing methods and values.
- Matching is the algorithmic comparison of two or more sets of records which relate to the same individual or entity.
- Determine, based on the profile of data and amount of duplicates, the applicability of a Probabilistic or Deterministic match algorithm
- Load data into the Matching Technology Tool to produce matching Results
- Review results, tune algorithm if desired and re-run algorithm.
- Based on match results and its review, set merge and link rules
- Produce merges/links
- Manually review merges/links over a sample set of data
- Determine suitability and “Go-Live” on finalized rules
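The standardization and deterministic matching steps above can be sketched as follows. This is a minimal illustration, not a production matcher: the abbreviation table, record fields, and sample values are all assumptions, and a real implementation would add probabilistic scoring as described above.

```python
import re

# Hypothetical expansion table agreed with data stewards:
ABBREVIATIONS = {"CORP": "CORPORATION", "INC": "INCORPORATED",
                 "INTL": "INTERNATIONAL", "CO": "COMPANY"}

def standardize(name):
    """Uppercase, strip punctuation, and expand common abbreviations."""
    cleaned = re.sub(r"[^\w\s]", "", name.upper()).strip()
    return " ".join(ABBREVIATIONS.get(word, word) for word in cleaned.split())

def deterministic_match(records_a, records_b, key="issuer_name"):
    """Link records whose standardized keys compare exactly equal."""
    index = {standardize(rec[key]): rec for rec in records_b}
    return [(rec, index[standardize(rec[key])])
            for rec in records_a if standardize(rec[key]) in index]

# Illustrative records: a vendor feed row versus an existing master row.
feed = [{"issuer_name": "Acme Corp.", "cusip": "000000AA1"}]
master = [{"issuer_name": "ACME CORPORATION", "security_id": "SEC-42"}]
print(len(deterministic_match(feed, master)))  # prints 1
```

Standardization is what makes the exact-equality match viable here; without it, "Acme Corp." and "ACME CORPORATION" would never link deterministically.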
Data Governance is an emerging discipline that marries data quality, policies, processes, and risk in the handling of data in an organization. The organization plays a key and vital role in Data Governance: non-IT personnel are required in data decisions, and IT should not make these decisions by itself. Data ownership at top-tier management should be the main concern. Regulation, compliance (SOX, Basel II), and contractual requirements call for formal Data Governance in the financial industry today.
- The organization's main focus is around data quality, which is hence the main objective for running a Data Governance program.
- The Organization should build clear decision rules and decision making processes for its shared data.
- Organizations should plan to move from siloed data management to integrated data management so that quality and governance can be assured at an enterprise level.
- Organizations should increase the value of data assets and have a robust mechanism to resolve data issues.
A Security Master Framework
ISO Standards:
Thousands of companies follow ISO standards while integrating their processes for data convergence and the consumption & publication of quality data feeds. The following ISO standards relating to financial instruments could potentially be used while building a Security Master:
- ISO9362 (BIC Bank Identifier Codes)
- ISO3166 (Country Codes)
- ISO4217 (Currency Codes)
- ISO10383 (Exchange/Market Codes)
- ISO6166 (ISIN Security Codes)
- ISO10962 (Security Types)
- ISO2014 (Date format)
- ISO8532 (Certificate Numbers)
- ISO 639 (Languages)