Tuesday, December 10, 2019

California Consumer Privacy Act and Data Governance Initiation Steps


California has the largest economy of any US State. In fact, if it were its own nation, it would be the fifth largest economy in the world, with a GDP of $2.9 trillion (Forbes, 2018). It is obvious, then, that any new regulations and rules passed by a State of 40 million people will have a noticeable impact on the many Organizations that serve its residents.

The California Consumer Privacy Act is one such regulation; it goes into effect on Jan 1, 2020 to protect the Privacy and Protection rights of the State's residents. My attempt here is to elucidate the need to build an effective Data Governance Framework, not just to meet regulatory compliance requirements but also to manage burgeoning data ingestion with effective Business Processes, Controls, and Fit-for-Purpose data sets in Analytics & Reporting, thus increasing the trust factor while reducing Cost.

CCPA
California residents will have the following rights from Jan 1, 2020 under the CCPA:
  • Know what personal information is collected about them for the prior 12 months by Organizations 
  • Know whether their personal information is sold or shared for a business purpose and to whom for the prior 12 months 
  • Access their personal information, with rights to delete it or opt out of its sale (subject to limits)
  • Equal service and price even if they exercise their privacy rights

Comply
Any Organization that meets any of the following criteria needs to comply with the CCPA (a simple applicability check is sketched after the list):
  • Any Organization whose Annual revenue is in excess of $25M 
  • Annually obtains the personal information of 50K or more California residents, households, or devices 
  • Derives 50% or more of its annual revenue from selling California residents' personal information
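To make the applicability test concrete, here is a minimal Python sketch of the three criteria above. The thresholds follow this post's list; treat it as an illustration, not legal advice.

```python
# Minimal sketch of the CCPA applicability criteria listed above.
def ccpa_applies(annual_revenue_usd: float,
                 ca_consumer_records_per_year: int,
                 pct_revenue_from_selling_pi: float) -> bool:
    """True if any one of the three CCPA thresholds is met."""
    return (annual_revenue_usd > 25_000_000
            or ca_consumer_records_per_year >= 50_000
            or pct_revenue_from_selling_pi >= 50.0)

# Example: a $10M business holding 80K California consumer records
print(ccpa_applies(10_000_000, 80_000, 5.0))  # True -- the record count alone trips it
```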

PIA Impact Study
Every Organization dealing with California residents' information can perform a Privacy Impact Assessment (PIA) by answering the following key questions.
  • Is Customer information storage dispersed, and how many copies exist, each with its own flavor?
  • Who has access, and are there access controls with usage audits, with PI attributes contextually anonymized?
  • What legal or compliance business processes surround the PI information and its data sharing agreements, both Internal and External?
  • Are there policies built around the Business context as well as the Legal context for usage of Customer PI information?
  • Is there tribal knowledge among the various Departments and their lines of business that needs to be articulated, standardized and documented?
  • Are there Data Stewards or Owners who manage and identify PI attributes, with knowledge of the risk remediation steps to be taken for any breach?
A PIA study will enable Organizations to build Use Cases which can then be prioritized. Executive Sponsorship and Support are crucial, as this initiative requires cross-functional communication among different groups and is often a Journey rather than a Sprint.

Checklists
Preparing checklists can be taken as the next step to capture the requirement details from the identified Department Stakeholders and Data Stewards:
  • Controller Checklist to assess the Lawfulness, Fairness and Transparency of individual rights, data security, data transfers and data breaches
  • Processor Checklist covering documentation, accountability, and the approach to individual rights on the Processor's side
  • Information Security Checklist covering how the business handles management and Organizational Information Security
  • Direct Marketing and Records Management Checklists
  • Data Sharing and Subject Access Checklists
  • CCTV Checklist on Closed-Circuit Television and Internal Cameras installed and their impact on the privacy of Employees & Customers

Data Protection Impact Assessment (DPIA)
Start small: identify the lines of business of importance to the Organization based on the checklist findings of relevant Use Cases. One may also utilize templates provided by tool vendors like Collibra Data Governance Center, IBM InfoSphere Information Governance Catalog, and SAP Master Data Governance to document your Processes and Workflows
  • Conduct a Risk Assessment to check and assess the maturity of the risk management framework with any predefined workflows (a simple risk-scoring sketch follows this list)
  • Identify and validate that Processors of Individuals' data rights are in alignment with the CCPA
  • Identify Remediation Plans to control or reduce the risk severity of any breach, with accountability to boot
  • Validate that there is a built-in sufficiency mechanism of technical and Organizational measures to help assess threshold scores
  • Identify the risks of sharing data with external parties and the corresponding remediation actions
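To illustrate the Risk Assessment step, here is a small sketch assuming a simple likelihood x impact scoring model; the scales, threshold and example risks are hypothetical, not mandated by the CCPA.

```python
# Hypothetical DPIA risk scoring: 1-5 likelihood x 1-5 impact.
RISK_THRESHOLD = 12  # assumed cutoff: scores at or above this need remediation

risks = {
    "unencrypted PI copies in analytics sandbox": (4, 5),  # (likelihood, impact)
    "external data sharing without agreement":    (2, 5),
    "stale access grants for former contractors": (3, 3),
}

for name, (likelihood, impact) in risks.items():
    score = likelihood * impact
    action = "REMEDIATE" if score >= RISK_THRESHOLD else "monitor"
    print(f"{score:>2}  {action:9}  {name}")
```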

Non-Compliance of CCPA
The major risk for any Organization in a breach is the loss of Trust and Reputation, which impacts growth more than the monetary fines indicated in the CCPA. The following monetary fines are laid out in the CCPA (a quick worked example follows the list):
  • $100 - $750 per consumer per incident, or actual damages, whichever is greater 
  • If an Organization fails to cure an alleged violation within 30 days, the following fines are enforced:
    • $2,500 in civil penalties for each violation
    • $7,500 for each intentional violation
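A quick worked example of how the statutory damages above can add up; the breach size below is hypothetical, and actual exposure depends on the facts of the case.

```python
# Hedged illustration of CCPA statutory damages, using the ranges above.
CONSUMER_MIN, CONSUMER_MAX = 100, 750  # per consumer, per incident

affected_consumers = 10_000            # hypothetical breach size
low = affected_consumers * CONSUMER_MIN
high = affected_consumers * CONSUMER_MAX
print(f"Statutory damages range: ${low:,} - ${high:,}")
# Statutory damages range: $1,000,000 - $7,500,000
```
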
Residents Consent and Rights
Organizations do not need consent from data subjects to use their data. However, safeguards need to be in place, along with provisions for the subjects to exercise any Right to Erasure or Right to be Forgotten. There are some exemptions, but these apply on a case-by-case basis and certainly do not give any business carte blanche to keep or use Customer information.

Finally, there are already many such Regulatory & Compliance Laws in place, including the New York Cyber Security Regulations, GDPR for European Countries, PIPEDA in Canada, and China's Cyber Security Law, with potentially many more on the horizon. As a result, many Organizations already have initiatives in the pipeline to build robust Data Governance over their Data Assets, with Process & Access Controls and ample Risk remediation actions.

Wednesday, November 20, 2019

Data Governance Framework Maturity


According to Profisee's 2019 State of Data Management Report, drawn from 863 participants across the Globe, Data Governance is one of the top 5 Strategic initiatives of 2019. Advances in Machine Learning and AI have bolstered digital transformation initiatives globally, while the availability of full-fledged DG (Data Governance) platform products like Collibra DG, Erwin DG, Informatica DG, SAP Master Data Governance and others has accelerated its adoption.

One must evaluate these platforms carefully, as they differ in features and capabilities, such as providing Native Connectors, APIs and Webhooks to sync data with various Applications, enabling workflows that automate data synchronization rather than doing it manually. Some platforms provide Cloud and/or Server options, while others provide a Hybrid Option with Secured Gateways.
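To illustrate the webhook pattern, here is a minimal sketch of a receiver that a DG platform could call when a glossary term changes. The endpoint path, event type and payload fields are hypothetical, not any specific vendor's API.

```python
# Hypothetical webhook receiver for data-governance sync events.
from flask import Flask, request

app = Flask(__name__)

@app.route("/dg-webhook", methods=["POST"])
def dg_webhook():
    event = request.get_json(force=True)
    # e.g. {"type": "term.updated", "term": "Customer", "steward": "jdoe"}
    if event.get("type") == "term.updated":
        print(f"Re-syncing term '{event['term']}' to downstream catalogs")
    return {"status": "ok"}

if __name__ == "__main__":
    app.run(port=8080)
```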

Recent high-profile incidents at Facebook, and the regular loss of Customer data at other high-profile Organizations, have also brought Data Governance and Security back onto the front pages and into the limelight with Executives. Secondly, most Enterprises have realized that by maturing Data Governance they can measurably benefit from Quality, Transparency and Trust in their Reports, Measures & KPIs, and thus improve the bottom line.

Maturity
An Organization can be grouped into one of 6 maturity levels, from Immature to Sophisticated, and its level can easily be validated by conducting an internal Survey and Assessment of the current State.

Instead of boiling the Ocean, Organizations can identify the Business Verticals of importance in their EIM (Enterprise Information Management) Model, along with their domains and a prioritization schedule. The most important requirement for a successful Data Governance engagement is Executive Support all the way through implementation.

Building a smaller yet nimble Organizational structure, with clarity on Collaboration, Engagement, Transparency & Responsibility of efforts among the various stakeholders and Data Stewards, is a prerequisite for an effective engagement.

Conducting roadshows on Success Stories to build more of them keeps the needle pointed forward with gusto, as the Journey toward the Zenith phase of the Data Governance Maturity Curve is a long one.

Drivers
Here are some key data points that can be used to evaluate where an Organization sits on the DG Maturity Curve. (If you agree with 5 or more data points, your Organization may be a good candidate to mature its Data Governance Framework; a quick self-scoring sketch follows the list.)
  1. Lack of Trust in Critical and Significant Enterprise Reports
  2. Lack of Data Lineage from Reports back to Models and source Data
  3. Lack of Transparency and Agreement to Measures and Metrics in Business and Cross Domains
  4. Lack of Data Stewardship, Ownership of Line of Business Data Assets
  5. Lack of identification of Redundant Reports and their Usage
  6. Lack of Role Based Access Control to PII Data
  7. Lack of Data Catalog & Business Glossary, MDM and Reference Data
  8. Lack of Data Quality Scores on Data Sets
  9. Lack of ability for Users and Viewers to Tag Data with Informational Tags to organize its Usage
  10. Lack of Issue Stewardship & Resolution documentation process for repeat data issues
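As noted above, here is a quick self-scoring sketch for the 10 driver data points; the True/False answers are placeholders to fill in for your own Organization.

```python
# Self-assessment against the 10 drivers above (True = capability present).
drivers = {
    "trust in critical enterprise reports":        False,
    "data lineage from reports to models":         False,
    "agreed measures and metrics across domains":  True,
    "data stewardship and ownership":              False,
    "redundant report identification":             False,
    "role-based access control to PII":            True,
    "data catalog, glossary, MDM, reference data": False,
    "data quality scores on data sets":            False,
    "user tagging of data sets":                   True,
    "issue stewardship and resolution process":    False,
}

gaps = sum(1 for present in drivers.values() if not present)
verdict = "good candidate to mature its DG Framework" if gaps >= 5 else "relatively mature"
print(f"{gaps}/10 gaps -> {verdict}")
```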

Conclusion


Organizations are maturing the Data Governance environment across their current ecosystem, including Data Lakes, by building and streamlining Data Governance Principles, Policies & Standards to Curate a Conformed, Fit-for-Purpose layer of Enterprise Data Assets, enhancing Value and Trust for Self-Serve Data consumption by their Data Citizens.

Saturday, August 3, 2019

RPA Robots in Digital Transformation


Software Robots, often called Bots, mimic human interactions (Mouse Clicks, Keyboard Navigation, Application Logins on Web and Desktop applications, and much more) and are programmed to perform repetitive or sequential Workplace Business Process Operations 24/7.

These Bots are not only cost-effective and time-saving, but also support many Organizations' digital transformation initiatives to achieve profitability and remain competitive. According to Gartner, global RPA spending will reach a total of $2.4 billion by 2022, and nearly 85% of large and very large organizations will have deployed some form of RPA.

The biggest early adopters included Banks, Insurance companies, Utilities, Telecommunications and Transportation companies. Moreover, any Workplace Business Operation that knits its Business Processes across different applications and platforms, or whose work involves mundane tasks performed manually, is an excellent candidate for RPA.

The ever-growing adoption of RPA by Industry is on account of its Low Code/No Code, Drag-and-Drop features built on robust frameworks, with libraries of thousands of Functions for Task Events, while facilitating the Monitoring & Scheduling of thousands of Bots from a Control Room.

Compare this with the whole nine yards involved in custom Coding efforts, which require extensive Business Process requirements gathering and enormous time spent searching for highly skilled developers to build Robust, Modular and Scalable solutions.

Today's Bots are easily built using Robotic Process Automation (RPA) tools offered by different vendors, improving accuracy and efficiency by automating repetitive tasks across Desktop, Custom and Native applications seamlessly. Some products also have Cognitive capabilities like Machine Learning and AI, at various degrees of maturity.

RPA is now a new branch of IT Specialization in the World of Automation euphoria; it is here to stay and is maturing very fast to meet demanding Business Workplace Process needs.

When selecting an RPA Vendor tool from among the current offerings, one needs to focus on the features already built, the ease of building & deploying Bots, and support for Maintenance, Monitoring, Scheduling and Reporting. One of my common, and proven, rules of thumb is to favor products with Open Community Support and a larger user base: they tend to have well-rounded features coming from the ground up and higher staying power in this fast-emerging technology.

An RPA tool should also support multiple platforms like Windows and Linux, the common workplace OS platforms, along with various Desktop applications such as MS Office and Google G Suite.

Many medium to large Organizations have large Offshore/Onsite support teams for their Business Process activities, and hence any RPA product should also fully support VDI infrastructure products such as Citrix, Amazon WorkSpaces, IBM Cloud, VMware Horizon Cloud and other major VDI offerings.

RPA Bots Utilization
RPA Robots are game changers that can automate and perform very large volumes of repetitive transactions in simple or complex Organizational Business Operational processes. Robots can also be built to automate and streamline complex business interactions by integrating various operational tasks across multiple application systems and channels.

Here are some simple to complex RPA activities that can be automated (a minimal sketch of the first item follows the list):
  1. Opening user emails to download specific attachments into a network folder
  2. Moving files from a source directory/folder to a target folder, including from and/or to an FTP server
  3. Logging into ERP, Native or Web applications and updating specific forms using data from files, including CSV
  4. Scraping Websites to extract selective data, such as Stock or Ticker Prices, or entire data sets
  5. Scanning PDFs, including OCR support to convert scanned or handwritten documents into text
  6. Selecting specific fields from Contract documents using AI-based pattern matching
  7. Connecting to ODBC- or JDBC-supported Databases like Oracle or SQL Server to perform CRUD Operations
  8. Streamlining and seamlessly supporting major and key Business Operations by building workflows across multiple native and custom applications, using a virtual user concept in the absence of APIs
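As an example of the first activity in the list, here is a minimal sketch of a Bot that downloads PDF attachments from unread emails into a folder, using Python's standard imaplib and email modules. The mail host, credentials, subject filter and target folder are all hypothetical.

```python
# Minimal email-attachment Bot sketch (hypothetical host and credentials).
import email
import imaplib
import os

TARGET_DIR = "/mnt/shared/invoices"  # assumed network folder

mail = imaplib.IMAP4_SSL("imap.example.com")
mail.login("bot@example.com", "app-password")
mail.select("INBOX")

# Find unread messages whose subject mentions "Invoice"
_, data = mail.search(None, '(UNSEEN SUBJECT "Invoice")')
for num in data[0].split():
    _, msg_data = mail.fetch(num, "(RFC822)")
    msg = email.message_from_bytes(msg_data[0][1])
    for part in msg.walk():
        filename = part.get_filename()
        if filename and filename.lower().endswith(".pdf"):
            with open(os.path.join(TARGET_DIR, filename), "wb") as f:
                f.write(part.get_payload(decode=True))

mail.logout()
```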

Some key Industry Use Cases
  1. Data-intensive Operation Companies that Extract & Capture data by Scraping Web sites and converting data from different media types & formats, including PDF with OCR requirements
  2. Finance Organizations accelerating their Mortgage Processing and Fraud Analysis tasks, configuring and building business rules for Loan Origination, and automating tedious tasks like Bank Reconciliation reporting across different applications
  3. Facilitating faster Clinical trial and drug approval oversight of Pharmacovigilance (PV) cases by reducing errors and enhancing compliance in a reduced process cycle time
  4. Automating Transportation Logistics & Business Operations, Shipment Scheduling & Tracking, by collating data and building Workflows, thus increasing efficiencies and reducing costs
  5. The Retail industry integrating Invoice Processing and Credit Collections to streamline End-to-End Order-to-Cash processes
  6. The Telecommunications industry automating its many manual, repetitive, rules-based processes
  7. The Utilities Industry automating the rule-driven processes of its highly traditional, regulation-driven marketplace
  8. Lastly, any Organization whose Business processes are rules-based and manual, or whose Processes interweave tedious manual extraction of data from many Native, Custom or Web Applications, can be automated using RPA Bots
Major RPA frameworks have fully functional GUI studios to build Robots using Drag-and-Drop features with enriched libraries, in a much shorter time window and with Low Code/No Code development effort. One can also build Self-learning Intelligent Robots that learn from repeatable actions using ML & AI features.

Saturday, July 27, 2019

IT Automation Implementation- Data Points


In today's world, the word Automation conjures a lot of confusion and scares many workers, as a disruptive technology that aims to eliminate their jobs. The fact is, it is going to redefine the roles and processes of their job functions rather than target them for elimination.

There is an interesting McKinsey article, "Four Fundamentals of Workplace Automation", which states that ALL jobs, from the CEO to Clerical roles, are bound to be impacted by various degrees of workplace process automation.

Because of this negative bias, Organizations need to tread cautiously and reassure their workers that an IT Automation effort is not aimed at individuals, but rather at redefining the tasks and activities they perform by reducing and automating the mundane ones.

To implement any automation of Business Operations, we should first rationalize the Operations in their entirety and identify and remove or modify redundant and inefficient process steps, rather than simply build automation scripts on top of the existing Standard Operating Procedures (SOPs). It is not enough to simply replace human interactive operational steps, which are fraught with many exceptions coupled with a lack of standard documentation, and thus difficult to program and handle.

Efforts spent on understanding and streamlining the business processes will certainly help not only to avoid costly and redundant operations, but also to build manageable building blocks, like Lego blocks.

Also, any initiative that is not properly thought through can spiral into chaos, business disruption and cost, and worse, be met by employees with distrust, resulting in higher attrition. Needless to say, any unplanned automation without Goals & Objectives, proper identifiable measures, and the right skills and tools is fumbling at the starting block.

IT Automation Implementation - Data Points
Here is my small attempt to identify some data points that could help Automation stakeholders and managers place the right foot forward and avoid costly mistakes where they do not have experienced Automation Czars or Evangelists to help them implement and navigate.
  • Define Automation Journey with Strategies
    • Identify Pressure points that are currently dragging productivity
    • Identify Challenges and potential bottlenecks for improvement
  • Get the biggest payback
    • Identify the low-hanging fruit and its value
    • Do not try to automate every part of the Process in one go
  • Validate Automation feasibility
    • Build POC’s as required and build and identify necessary steps to scale
    • Identify Complexities and time required to execute them while keeping a shorter window of weeks rather than months and years
  • Identify building blocks & framework for readiness in Automation
    • Rationalize and fine-tune bulky process steps, or break monolithic ones into smaller pieces
    • Build automation capabilities, with the skills & tools to manage the new Operating Business Model
  • Identify meaningful tests, with regression tests and recovery strategies
    • Identify and avoid any business disruptions as these can be costly
    • Group similar automation initiatives under automation Evangelists & Stewards
  • Identify feasibility of Automation within Product's platform
    • Identify Time based workflows, Trigger or Event based automation within the Platform, Application
    • Build with minimal wiring from external Software agents to trigger Automation work rules or notifications
  • Design & Build Automation efforts that are sustainable, maintainable & repeatable
    • Build manageable process steps with proper exception handlers
    • Build process steps that are configurable with external inputs and can be executed in parallel
  • Define Operating Model and Governance along with Goals
    • Raise awareness, get alignment with the folks impacted by Automation, and address any negative and unfounded assumptions
    • Rationalize initiatives, identify Growth or Process-improvement Opportunities, and define factors with measurable units toward the Goals
Automation is a Journey, not a 100-meter dash; hence, measure its benefits at regular intervals. Also, as the McKinsey article indicates, Automation is not about targeting low-wage roles but all rungs of the workplace ladder, including the highest-paid occupations in the Organization.

Friday, March 23, 2018

Consortium Blockchain a Right Step


Blockchain is a very disruptive technology, and if industries do not keep pace in understanding, experimenting with, and adapting to its impact on a changing ecosystem, enterprise leaders' fear of missing out (FOMO) could become a reality, as it did for brick & mortar companies that missed the opportunities of the Internet in the early 90's.

Today, to be on the safer side, many enterprises are forming or joining Consortiums, testing the waters so as not to be left out. Global financial companies lead the way, followed by Cross-Sector and Life Sciences & Healthcare enterprises.

The significance and ripple effect of the Bitcoin Cryptocurrency tsunami, which is based on Blockchain, made Industry leaders pay closer attention to its underlying technologies. Blockchain technology is not new; what is new is how the Distributed paradigm of hardware (nodes) and the Distributed Ledger Technology (DLT) have been leveraged, coupled with the removal of fears over identity security and of the need for an intermediary controller's oversight to validate the truth of transactions.

The implementation of Smart Contracts on the blockchain, with semi-trusted control over transaction validity and heightened identity security, by niche platforms like the Ethereum Consortium, Hyperledger Fabric (from the Linux Foundation), R3's Corda, Quorum and Chain Core made it much more enticing for industry leaders. Not to be outdone, major Cloud vendors including IBM, Microsoft and AWS made these platforms available on their portals, along with development toolsets to build and deploy enterprise applications.

Consortium Blockchain features
  1. Platforms support Privacy and Confidentiality for all participants in the network
    • It is built on trust, where participants know that all of their transactions can be traced
    • All network participants' identities are backed by cryptographic certificates tied to their Organizations, network components, and end users or client applications
    • Blockchain supports business exchange of all assets, from tangible ones (Cars, Whole foods) to intangible ones like Futures, Intellectual Property, etc.
    • Only known participants can take part in transactions, making it a private and permissioned system
  2. Distributed Ledger transaction records are built in a decentralized and collaborative way
    • Transaction Assets are defined using a collection of key-value pairs (JSON), with digital signatures of every endorsing peer node and values that can be encrypted
    • The Ledger is a sequenced, tamper-resistant record of all state transitions, chained together in the blockchain, with one ledger per channel, making it immutable (see the hash-chaining sketch after this list)
    • Each participant has their own replicated copy of the ledger, which is also shared with the other participants
    • There is a single place of Origin for all transactions in the blockchain, providing provenance for the transactions
  3. Ledger functions and controls are implemented using Smart Contracts to automate and execute terms mutually agreed upon by the participants
    • Privacy is key for the B2B model, and this is implemented using Smart Contracts and Channels
    • Smart contracts are written in chaincode, which enforces the business logic defining assets and transactional logic
    • A Channel's ledger contains a configuration block defining policies, ACLs and other pertinent information
    • Major programming languages like C++, GO, Java and JavaScript are used to program the chaincode
  4. A process called Consensus is used to commit transactions only upon approval by the appropriate participants
    • Consensus allows transactions to be synchronized across the network through shared data ledgers, as well as shared programs that update them
    • Transactions are written in the order in which they occur in the consensus process, to avoid any malicious entries
    • Transactions are committed only when a block's transactions have met the explicit policy criteria, checks and balances dictated by the members, and been endorsed, validated and version-checked
    • The Consensus mechanism can utilize different messaging/ordering technologies like SOLO, Kafka, and Simplified Byzantine Fault Tolerance (SBFT)
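To make the tamper-resistance point concrete, here is a conceptual Python sketch of a hash-chained ledger. It illustrates the principle only; it is not any platform's actual ledger code.

```python
# Conceptual hash-chained ledger: each block stores its predecessor's hash,
# so altering any historical transaction breaks the chain.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transactions: list) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain: list) -> bool:
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
append_block(ledger, [{"asset": "car-123", "owner": "A"}])
append_block(ledger, [{"asset": "car-123", "owner": "B"}])
print(verify(ledger))                        # True
ledger[0]["transactions"][0]["owner"] = "X"  # tamper with history
print(verify(ledger))                        # False -- the chain is broken
```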

Use Cases
There are many use cases identified across various industries, including Capital Markets, Financial Services, Healthcare, Government Legal/Regulatory, Insurance, Supply Chain Logistics, Anti-Counterfeiting and Travel, in various stages of implementation, development, design and PoC. Here are some:

Healthcare
  1. Large Pharmaceutical companies are using Consortium blockchain platforms for their clinical trials to overcome high costs and meet Government regulations. Clinical trials involve the timely collection and management of data from patients, physician offices, recruiters and research centers spread across vast geographical areas.
  2. Medical Insurance claims are part of a 3-trillion-dollar US healthcare market, yet still run on 40-year-old EDI, paying for value-added services to validate transactions, with a significant percentage of clerical staff processing paper input to service the long tail of small-scale providers.

Supply Chain Logistics:
This use case is an ultimate one for Blockchain implementation. There are two key dimensions required in any Supply Chain, from creation to end-delivery: 
  • End-to-End traceability, with the ability to trace back to Origins, which grows in varying degrees with the complexity of the product 
  • Identifying the passage and impact of laws and regulations influencing the product's journey
The complexity is further compounded when the supply chain has to trace the assembly of these products into SKD (semi-knocked-down) kits and then into the final product. Imagine extending the thought process to trace installation issues, repairs, upgrades, warranties, etc.

Capital Markets:
Currently, a group of 26 Japanese financial institutions has successfully implemented international OTC (Over-the-Counter) derivatives on blockchain, applying master agreements without the need to renegotiate with each counterparty, thus improving transparency and simplifying data management.

Recently, Credit Suisse and ING completed the first live securities lending transaction, valued at EUR 25 million, using R3's Corda Blockchain platform.

Conclusion:
A methodical study and evaluation of a disruptive and revolutionary paradigm such as Blockchain Technologies against the current landscape of one's ecosystem holds more promise than failure.

Saturday, March 3, 2018

FHIR a Strategic Initiative for Healthcare Providers to build Clinical Data Repositories


Today's Hospitals are collecting vast amounts of data from their day-to-day operations on Patients, such as Clinical Observations, Conditions, Encounters, Medications, Pharmacy and Imaging, including vital signs captured by sensors, to support both Operational and Research requirements. They are also looking at ways and means to map their internal data with external data that conforms to some standard, for ease of mapping and contextualizing the data.

However, the major problem they face is their inability to combine data from the multitude of silos created by different applications for any meaningful and well-rounded analysis, whether from a research perspective, to support policy decision-making, or to provide better and more cost-effective analysis to improve Patient care.

Many major hospitals have built healthcare data repositories that store data in its native format with metadata tags to identify the context of the data, what they call Data Lakes, which can be queried and extracted from based on the questions they wish to answer.

The success and utilization of these Data Lakes depends upon how well they have been designed and built to avoid becoming dark data silos, and on minimizing their inherent limitations in addressing questions such as: how the data was brought in; how and where it can be found; how to explore it; and what and how it needs to be transformed to be of use. The task of identifying, preparing and combining several data sets from different hierarchical depths is challenging for NON-IT-STAFF members like Researchers and Business users, and no easier for IT folks without Health Domain knowledge.

What is FHIR (Fast Healthcare Interoperability Resources), the Open Industry Standard from HL7, and how does it come in handy to alleviate these and many other bottlenecks?

HL7, a Non-Profit Organization accredited in building Healthcare Standards, introduced FHIR (pronounced "FIRE"): www.hl7.org/fhir/

Healthcare providers have been using various HL7 Open Standards for the Exchange of Information since 1987, such as HL7 V1, V2, V3, CDA and C-CDA. The FHIR Framework standards were written to address various pitfalls of those earlier specifications, while also keeping in view advances in IT technologies that can be leveraged for effective implementation.

FHIR is a new Open standard getting a lot of traction in the Healthcare Industry. It defines more than 180 Granular and Normalized Entities, with attributes in defined formats. These "Containerized" entities are called Resources; in the NoSQL World they correspond to Collections, with individual instances as Documents. FHIR also provides RESTful API protocols for exchanging information between legacy healthcare systems and for integrating with different application systems.
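As a small illustration of the RESTful exchange, the sketch below fetches a single Patient Resource as JSON using Python's requests library. The base URL points at HL7's public HAPI test server and the resource id is illustrative; both are assumptions for this example.

```python
# Fetch one FHIR Patient Resource over the RESTful API (illustrative id).
import requests

BASE = "http://hapi.fhir.org/baseR4"  # public HAPI FHIR test server

resp = requests.get(f"{BASE}/Patient/example",
                    headers={"Accept": "application/fhir+json"})
resp.raise_for_status()
patient = resp.json()

print(patient["resourceType"])        # "Patient"
print(patient.get("name", [{}])[0])   # first HumanName element, if present
```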

The common data formats used in FHIR are XML and JSON. An 80/20 rule was applied when the Industry identified these distinct resources in the current draft of the standard: roughly 80% of Clinical data needs can be met using 20% of the Resources identified by the Industry.

These FHIR Resources are grouped under various heads, such as Individuals (Patient, Practitioner, Person, Group), Diagnostics (Observation, Specimen, ImagingStudy), Medications (Medication, Immunization, MedicationRequest), Care Provision (CarePlan, ReferralRequest, RiskAssessment), Management (Encounter, EpisodeOfCare) and Workflow (Appointment, Schedule, Task) for Clinical data sets, with many more supporting Financial Domain Coverage (Billing, Payment) and various specialized Health Research domains.

FHIR Specifications and Standards are exhaustive and detailed, providing standards for how Profiles and Extensions of Elements and Data Types can extend the attributes in Resources, and for how Resources can be combined for different Use Cases.

The most conspicuous benefit of FHIR is the ability to build a conformed, standardized, yet unified view of common data sets/documents that are interoperable and shared with precise definitions, both internally and with external systems and vendors.

Other benefits include support for Clinical Terminologies and Ontologies such as SNOMED, ICD-9/10, LOINC and other Open Standards, thus avoiding the need to design a separate Terminology Service with Codes for Lab notes, Diagnostics, Medications, Imaging and others.

Lastly, and most importantly for me, FHIR Resources can also be presented in the RDF (Resource Description Framework) data model as Linked Data, serializing property information in Turtle format or as JSON-LD, to support Graph DBs. An RDF Graph DB, queried with SPARQL, can enhance a Healthcare provider's ability to identify complex patterns of relationships in real or near-real time, which could save lives and decrease costs for Patients.
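Here is a small sketch of that RDF angle using the Python rdflib library: it loads a simplified, hand-written Turtle fragment (not an official FHIR RDF serialization) and runs a SPARQL query over it.

```python
# Load a toy FHIR-style Turtle fragment and query it with SPARQL.
from rdflib import Graph

turtle = """
@prefix fhir: <http://hl7.org/fhir/> .
<http://example.org/Patient/1> a fhir:Patient .
<http://example.org/Observation/1> a fhir:Observation ;
    fhir:Observation.subject <http://example.org/Patient/1> .
"""

g = Graph()
g.parse(data=turtle, format="turtle")

query = """
PREFIX fhir: <http://hl7.org/fhir/>
SELECT ?obs ?patient WHERE {
  ?obs a fhir:Observation ;
       fhir:Observation.subject ?patient .
}
"""
for row in g.query(query):
    print(row.obs, "->", row.patient)
```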

FHIR Implementation
FHIR can be implemented in a phased manner, depending on how many different domain repositories and areas of interest the Healthcare provider intends to build and leverage for insights. HL7 also provides HAPI (pronounced "happy"), a Java-based healthcare package library that enables adding FHIR messaging to your applications and building different FHIR Resources.

NoSQL Database
Today, Key-Value stores are the norm, given their flexibility in building schemaless, horizontally scalable databases that support the object-oriented paradigm. Add to this their performance relative to an RDBMS as data volumes scale into billions of rows or terabytes/petabytes, and their ability to meet high-throughput, low-latency traffic demands, and there is no better alternative than NoSQL. So, in my opinion, a NoSQL data store is a perfect match for FHIR.
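As a minimal sketch of that match, here is how a FHIR Resource could be persisted as a JSON document in MongoDB via pymongo, with one collection per Resource type. The connection string, database and collection names are illustrative assumptions.

```python
# Store and retrieve a FHIR Patient document in MongoDB (illustrative names).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
patients = client["fhir"]["Patient"]  # one collection per FHIR Resource type

patient = {
    "resourceType": "Patient",
    "id": "pat-001",
    "name": [{"family": "Chalmers", "given": ["Peter"]}],
    "gender": "male",
}
patients.replace_one({"id": patient["id"]}, patient, upsert=True)  # idempotent upsert

print(patients.find_one({"id": "pat-001"})["name"][0]["family"])  # Chalmers
```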

Conclusion: Containerizing data from Data Lakes and HL7 Messaging into a NoSQL Database, using FHIR Standards as Resource Types (Collections/Documents), yields self-contained, documented, standardized building blocks that can be readily utilized by both Research and Business Users to meet their Operational and Research needs with ease.

Saturday, September 9, 2017

Connected Sensor Cars

When I bought my semi-upper-end Toyota Prius Prime last month, I was surprised that the Car was built with so many sensors that it could park itself, alert me when I was wavering into another lane, and even automatically adjust the speed on cruise control when another Vehicle moved into my lane or it saw a danger of collision with a slower-moving one.

These features were all possible because the car leverages different kinds of built-in sensors. Many luxury and top-end cars have even more features than this one. The point I am trying to make is that current and future cars are likely to have hundreds of different kinds of sensors for Safety, Ease of Use and more, to truly be called Connected Cars, a phase before Autonomous Cars take to our roads and bring a paradigm shift, including the use of Car As a Service (CAaS).

A quarter of the 100 million cars produced globally each year have sensors providing functionality ranging from Personal Health Monitoring (Impaired Driver Checks, Emergency services), Vehicle Care (Remote Vehicle repairs, Breakdown Mgmt), Digital Marketing (Location-based Offers, Couponing), Connected ADAS (Road Sign Detection, Road Condition Warning) and Safety features (Stolen Vehicle Tracking), to other services like Concierge, Infotainment and other Customer-focused support features.

Connected Cars are leveraged to support Community-based sharing of information, from curbside parking availability to providing local authorities with traffic metrics on Speed and Congestion for any given road. A good example is Waze, a Google-owned free navigation mobile app that has revolutionized GPS navigation with real-time traffic alerts on congestion, obstacles and even Cops nearby during one's drive, using this Community-Sharing paradigm.

INRIX is another major traffic-information aggregator, providing real-time travel information and traffic patterns on highways, local roads and intersections, 24x7, to several private, municipal and public institutions, using its own GPS-supported navigation systems equipped in millions of Cars, along with support from 3rd Parties and the Community globally. Its navigation features enable it to provide Smart Routing and Live Traffic information, besides Traffic Prediction.

The one major feature that is changing radically and becoming highly visible concerns Parking. Last year, Waze teamed up with INRIX to provide a feature called "Where to Park", enabling customers to identify optimal parking spots as they drive to their destination. The Customer can select from any of the Parking Spots, including Commercial lots and Curb-side Open spots updated in near real-time, weighing the walking distance to their destination provided by the app.

In 2016, the Federal US DOT (Department of Transportation) funded more than $45 million across 3 major US pilot sites (New York, Florida, Wyoming) to operationalize cutting-edge mobile and roadside technologies designed to save lives, improve personal mobility and reduce environmental impacts. In my opinion, with millions of Connected Cars on the road, these vehicles will become the game changer and present a more viable option for addressing the goals of many such initiatives, in place of permanent fixtures.

The next generation of Autonomous (self-driving) Cars will bring an even more radical paradigm shift in how we travel from point A to point B with ease, besides opening up congested roads, introducing Warehouse parking options, and reducing car ownership, garage-space requirements and so forth.

It's no longer a futuristic trend, but real. We are in an exciting time period, watching this travel mode unfold before our own eyes, opening up opportunities and challenges with its adoption.