Deepa Sachdev

Newark

Summary

  • Data analysis professional with 9+ years of total IT experience and expertise in data modeling for data warehouse/data mart development, data analysis, SQL, and developing conceptual, logical, and physical database designs for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) systems
  • Hands-on experience with MMIS, ACA, HIX, HIPAA, and HL7 processes; worked closely with EDI 820 Payment Order/Remittance Advice and EDI 834 Benefit Enrollment and Maintenance, and implemented client EDI 834, 835, and 837 standards and data mapping for EDI files
  • Experience in using SSIS tools such as the Import and Export Wizard, Package Installation, and SSIS Package Designer
  • Excellent knowledge of the complete data warehouse life cycle, testing methodologies, OLAP, and OLTP
  • Well versed in normalization and denormalization techniques for optimum performance in relational and dimensional database environments; performed normalization up to 3NF
  • Worked on one complete ERP implementation project
  • Hands-on experience in using the MapReduce programming model for batch processing of data stored in HDFS
  • Experience in converting multiple reports from Excel, Tableau, and BO to Spotfire
  • Experienced in finding trends in data using predictive modeling techniques such as Linear Regression, K-Means Clustering, PCA, Decision Trees, Random Forests, KNN, and Logistic Regression
  • Implemented machine learning to predict future trends, validated with p-values; this work had a major impact on the project's results. Made predictions based on data patterns and calculated ROC
  • Used data analysis techniques to validate business rules and identify low-quality and missing data in existing data
  • Experienced with most software development life cycle (SDLC) methodologies, including Waterfall, Iterative, Agile, and Rapid Application Development (RAD)
  • Experienced in business requirements confirmation, data analysis, data modeling, logical and physical database design, and implementation
  • Familiarity with Crystal Reports, SSRS - Query, Reporting, Analysis and Enterprise Information Management, and MongoDB
  • Experienced in normalization and denormalization processes and in logical and physical data modeling techniques
  • Installed and configured Cassandra; in-depth knowledge of Cassandra architecture, query, and read and write paths
  • Experienced in integrating high-level business rules (constraints, triggers, and indexes) with the code
  • Proven knowledge in capturing data lineage, table and column data definitions, valid values, and other necessary information in data models
  • Strong experience in ER and dimensional data modeling to deliver normalized and star/snowflake schemas using Erwin 9.6/8.2/7.2/7.0, Power Designer 15, Embarcadero ER/Studio 9.1.x, and Microsoft Visio
  • Experience in data scrubbing/cleansing, data quality, data mapping, data profiling, and data validation in ETL
  • Experience in designing star schemas and snowflake schemas for data warehouse and ODS architectures
  • Excellent communication, interpersonal, analytical, and leadership skills; a quick starter with the ability to master and apply new concepts

Overview

12 years of professional experience

Work History

Sr. Data Modeler/Data Analyst

Capital Group
02.2017 - Current
  • Involved in requirement gathering together with the business analyst group and in identifying KPIs
  • Gathered analysis report prototypes from the business analysts of the business units
  • Gathered various requirements matrices from the business analysts
  • Worked on MongoDB data modeling, balancing the needs of the application, the performance characteristics of the database engine, and the data retrieval patterns
  • Used MongoDB to change the structure of documents in a collection, such as adding new fields, removing existing fields, or changing field values to a new type, and updated the documents to the new structure
  • Documented system performance by studying the prevailing data model
  • Mapped company enterprise requirements and new databases to a logical data model that defines the project delivery needs
  • Maintained spreadsheets to collect, track, prepare, compile, and distribute statistical data for daily, weekly and monthly reports using Amisys
  • Participated in Joint Application Development (JAD) sessions
  • Conducted Design discussions and meetings to agree on the appropriate Data Model
  • Designed complex data models using the database tool Erwin r7.0
  • Used ER Studio to create logical and physical data models for an enterprise-wide OLAP system
  • Developed Star and Snowflake schema-based dimensional models, growing the data warehouse
  • Modeled the dimensions and facts using Erwin for centralized data warehouse
  • Identified and tracked slowly changing dimensions and determined the hierarchies
  • Used Teradata utilities FastLoad, MultiLoad, and TPump to load data
  • Responsible for managing the life cycle of the department, including compliance initiatives and group practice agreements
  • Converted multiple reports from Excel, Tableau and BO to Spotfire
  • Transformed data from one server to other servers using tools like Bulk Copy Program (BCP), and SQL Server Integration Services (SSIS) (2005/2008)
  • Expertise in Creating Report Models for building Ad-hoc Reports Using SSRS
  • Expertise in Generating Reports using SSRS and Excel Spreadsheet
  • Modeled and populated the business rules using mappings into the Repository for Meta Data management
  • Worked on converting data stored in flat files into Oracle tables
  • Experience in utilizing SSRS and Cognos in creating and managing reports for an organization
  • Involved in designing, developing and deploying reports in MS SQL Server environment using SSRS-2008
  • Implementation of Business Rules in the Database using Constraints & Triggers
  • Used data analysis techniques to validate business rules and identify low quality missing data in the existing data
  • Created, tested and implemented Teradata Fastload, Multiload and Bteq scripts, DML and DDL
  • Actively took part in statistical analysis and data mapping activities with the data warehouse
  • Created summary tables using de-normalization techniques to improve complex join operations (see the SQL sketch after this list)
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis linked to Loan products
  • Participated in the tasks of knowledge migration from legacy to new database system
  • Worked on Metadata transfer among various proprietary systems using XML
  • Conducted Design reviews with business analysts, content developers and DBAs
  • Designed and implemented traditional data models to manage marketing strategies and to satisfy reporting needs dynamically
  • Organized User Acceptance Testing (UAT), conducted presentations and provided support for Business users to get familiarized with banking applications
  • Handled performance requirements for databases in OLTP and OLAP models
  • Worked together with the Implementation team to ensure a smooth transition from the design to the implementation phase
  • Performed Data Loading and Transformation using Data Junction 6.0
  • Environment: Erwin r9.1, ER Studio, SQL Server 2005, SSIS, MicroStrategy, SSRS, Windows XP/NT, Oracle 11i, MS-DTS, UML, SQL Loader, Data Junction, XML files, Rational Rose 2000, MongoDB, Oracle JD Edwards EnterpriseOne 8.12, TERADATA.
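
A minimal SQL sketch of the de-normalized summary-table approach referenced above; the table and column names (loans, branches, payments, loan_summary) are hypothetical placeholders, not actual warehouse objects.

    -- Hypothetical sketch: pre-compute a frequently joined loan/payment
    -- aggregate into a de-normalized summary table so reports avoid the join.
    CREATE TABLE loan_summary (
        loan_id       INTEGER       NOT NULL PRIMARY KEY,
        product_code  VARCHAR(10)   NOT NULL,
        branch_name   VARCHAR(50),                -- copied from branches (de-normalized)
        total_paid    DECIMAL(18,2),
        payment_count INTEGER
    );

    INSERT INTO loan_summary (loan_id, product_code, branch_name, total_paid, payment_count)
    SELECT l.loan_id,
           l.product_code,
           b.branch_name,
           SUM(p.amount),
           COUNT(*)
    FROM   loans    l
    JOIN   branches b ON b.branch_id = l.branch_id
    JOIN   payments p ON p.loan_id   = l.loan_id
    GROUP  BY l.loan_id, l.product_code, b.branch_name;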

Sr. Data Modeler/Data Analyst

AmeriSource Bergen
08.2014 - 12.2016
  • Studied the Requirements Specifications, use cases and analyzed the data needs of the Business users
  • Converted the Logical data models to Physical data models to generate DDL
  • Extensively worked with Teradata database as part of the enterprise warehouse development and the data mart, staging and prestaging environment build outs
  • Updated the Naming and version control standard documents and implemented version controlling using the Model Manager
  • Migrated several models in ERwin 4.1/7.1 into ERwin 7.2 and updated the naming standards
  • Created complex mappings and mapplets using Lookup, Expression, Aggregator, Sequence Generator, Union, Normalizer, and Router transformations
  • Involved in writing PL/SQL validation scripts to identify data inconsistencies in the sources (see the SQL sketch after this list)
  • Worked with Business Analysts to design weekly reports using Cognos
  • Researched rejections using EDI tools, looking at raw data and submission using Amisys
  • Developed and maintained in-depth knowledge of the Amisys data from technical and user perspective
  • Created Design documents, Source Target Mappings and Sign-off documents
  • Worked on the Enterprise Metadata repositories for updating the metadata and was also involved in Master Data Management (MDM)
  • Built the Transformation Rules engine for use of all the designers across the project
  • Documented the designs so that personnel could understand the process and incorporate the changes as and when necessary
  • Responsible for detailed verification, validation and review of the design specifications
  • Conducted review walk through of data models involving SME, developers, testers and analysts
  • Environment: ERwin 7.2/7.1/4.1, Informatica 8.0, Cognos, Teradata, Oracle 9i, SQL Server 2003, SQL, PL/SQL, MS Office, Windows 2003.
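
A minimal sketch, in plain SQL rather than the PL/SQL scripts mentioned above, of the kind of validation query used to surface source-data inconsistencies; the staging and reference table names (stg_enrollment, ref_member) and their columns are hypothetical.

    -- Hypothetical sketch: report source rows that reference a member missing
    -- from the reference table, plus rows with required fields left null.
    SELECT s.record_id, 'member_id not found in reference table' AS issue
    FROM   stg_enrollment s
    LEFT JOIN ref_member r ON r.member_id = s.member_id
    WHERE  r.member_id IS NULL
    UNION ALL
    SELECT s.record_id, 'required field is null' AS issue
    FROM   stg_enrollment s
    WHERE  s.plan_code IS NULL
       OR  s.coverage_start_dt IS NULL;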

Data Modeler

HCSC
02.2013 - 05.2014
  • Participated in requirement gathering session, JAD sessions with users, Subject Matter experts, Architects and BAs
  • Participated in Sprint Testing, documented the Project Change Control and the Impact Analysis, and worked with BAs on GAP Analysis
  • Worked with System Architects in providing solutions to the system design, reducing the number of screen selections, internal and external application workflow design and in design of user interface screens
  • Reviewed clients' network data requirements for the design and configuration of routers, switches, OSPF protocol, and MPLS, spanning over 25 network technical projects for Duke NERC CIP
  • Provided solutions to performance issues in the databases, including re-indexing, recompiling stored procedures, and other maintenance tasks
  • Applied Normalization techniques and created Logical and Physical models based on the requirements
  • Conducted and participated in Database design review meetings
  • Prepared Enterprise Naming Standard files and project specific naming standard files in some exception cases
  • Worked with the Enterprise Architect team in developing Enterprise Data Models, which are used by most of the applications
  • Designed SSAS (SQL Server Analysis Services) cubes, SSRS (SQL Server Reporting Services) reports, and ad-hoc querying facilities
  • Worked with MDM team in making changes to database to accommodate requirements and in Capturing the demographic information from different applications by providing source application data dictionaries, identifying the demographic information and by providing DDL
  • Involved in the redesigning of Legacy systems, Modifications, Enhancements and Break Fixes to existing systems and in integration of one system with the other
  • Worked on Forward and Reverse Engineering using Erwin, reverse engineered XSD structures, excel spread sheets and copybooks
  • Worked on Comparing different models, different versions of models using complete compare in Erwin and compared Databases directly and produced alter scripts
  • Worked on upgrading Erwin Data modeler from v7.3 to v9.5, served as Erwin Repository administrator
  • Experience with Version Control tools
  • Experience in Developing Stored Procedures, Functions and Triggers using T-SQL
  • Worked with DBAs in migrating data from one database environment to another
  • Worked with ETL teams in designing Load jobs and used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, table load order documents, design mappings, create repositories and establish users, groups and their privileges
  • Analyzed Data Specification Documents (DSDs) and implemented their logic in the form of Informatica mapping design
  • Worked with BI team in providing SQL queries, Data Dictionaries and mapping documents (Report attributes to Database columns)
  • Acted as a strong Data Analyst analyzing data at a low level in conversion projects; provided mapping documents between Legacy, Production, and User Interface systems
  • Extensively performed Data Profiling, Data Cleansing, and de-duplication of data, and has good knowledge of best practices
  • Involved in high complex research, quantitative analysis, modeling and monitoring of data necessary to produce the logical design for review
  • Developed and initiated more efficient data collection procedures
  • Worked with Architects in designing conceptual model for the Data warehouse and Identified Facts and Dimensions, designed Logical and Physical data models
  • Designed different types of STAR schemas, such as detailed data marts, plan data marts, and monthly summary data marts, using Erwin with various dimensions like Time, Services, and Customers and various fact tables (see the SQL sketch after this list)
  • Worked on Converting Physical only models to Logical and Physical models
  • Environment: Oracle 10g/11g/12c, SQL Server 2005/2008/2012, SSRS, Erwin 7.3/9.5, IBM DB2, CaseComplete, Tableau, Blue Zone, Toad SQL Developer, Informatica Analyst 9.6.1, Sybase Power Designer 15, MS Office 2015, Crystal Reports, ALTOVA XML Spy.
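
A minimal SQL sketch of the star-schema layout described above, with a fact table keyed to the Time, Customers, and Services dimensions; all table and column names are hypothetical placeholders rather than the actual model.

    -- Hypothetical star schema: three dimensions and one fact table.
    CREATE TABLE dim_time (
        time_key    INTEGER PRIMARY KEY,
        calendar_dt DATE,
        month_nbr   INTEGER,
        year_nbr    INTEGER
    );

    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY,
        customer_id  VARCHAR(20),
        customer_nm  VARCHAR(100)
    );

    CREATE TABLE dim_service (
        service_key  INTEGER PRIMARY KEY,
        service_cd   VARCHAR(10),
        service_desc VARCHAR(100)
    );

    -- Grain: one row per customer, service, and day.
    CREATE TABLE fact_service_usage (
        time_key     INTEGER NOT NULL REFERENCES dim_time (time_key),
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        service_key  INTEGER NOT NULL REFERENCES dim_service (service_key),
        usage_cnt    INTEGER,
        charge_amt   DECIMAL(18,2)
    );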

Data Modeler/Data Analyst

Pearson
03.2012 - 01.2013
  • Gathering business requirements by organizing and managing meetings with business stakeholders, Application architects, Technical architects and IT analysts on a scheduled basis
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow
  • Created conceptual and logical models, logical entities and defined attributes, and relationships between the various data objects
  • Updated existing models to integrate new functionality into an existing application
  • Conducted one-on-one sessions with business users to gather data warehouse requirements
  • Developed normalized Logical and Physical database models to design OLTP system
  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin r9
  • Worked on multiple implementations of the corporate-sponsored and variable universal life insurance sections of their web-based application
  • Created DDL scripts for implementing Data Modeling changes
  • Created Erwin reports in HTML and RTF formats depending upon the requirement, published the data model in Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
  • Maintained and implemented Data Models for the Enterprise Data Warehouse using Erwin r9
  • Created and maintained Metadata, including table and column definitions
  • Synced up the models by reverse engineering, and used compare model and merge model to reconcile the database with the original models
  • Responsible for defining the naming standards for data warehouse
  • Used Model Mart of Erwin r9 for effective model management of sharing, dividing and reusing model information and design for productivity improvement
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW)
  • Verified the correct authoritative sources were being used and the extract, transform and load (ETL) routines would not compromise the integrity of the source data
  • Tuned SQL queries by breaking down the predefined queries into smaller queries for processing, and pushed query processing up from the client to the application server level (see the SQL sketch after this list)
  • Possess strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in sessions to identify requirement feasibility
  • Environment: Erwin r9, SQL Server 2008, Oracle 11g, Informix 16, MS Excel, MS Visio 2010, Requisite Pro 7.1, TOAD, SQL, PL/SQL, Windows.
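
A minimal SQL sketch of the query-decomposition tuning technique mentioned above: stage an expensive aggregate once in a smaller intermediate table, then join the reduced result set; the table names (orders, customers, tmp_order_totals) are hypothetical.

    -- Hypothetical sketch: replace one large nested query with two smaller steps.
    CREATE TABLE tmp_order_totals (
        customer_id INTEGER,
        order_total DECIMAL(18,2)
    );

    -- Step 1: compute the expensive aggregate once.
    INSERT INTO tmp_order_totals (customer_id, order_total)
    SELECT customer_id, SUM(order_amt)
    FROM   orders
    WHERE  order_status = 'CLOSED'
    GROUP  BY customer_id;

    -- Step 2: join the much smaller intermediate result to the customer table.
    SELECT c.customer_id, c.customer_nm, t.order_total
    FROM   customers c
    JOIN   tmp_order_totals t ON t.customer_id = c.customer_id
    WHERE  t.order_total > 10000;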

Education

Bachelor of Technology - Computer Science

Skills

Data modeling tools

  • Erwin r7.1/7.2
  • Erwin r8.2
  • Erwin r9.1/9.5/9.6
  • Embarcadero ER/Studio
  • Oracle Designer
  • Sybase Power Designer 15/16.6

OLAP Tools

  • Microsoft Analysis Services
  • Tableau
  • Business Objects (BO)
  • Crystal Reports 9

Programming Languages

  • SQL
  • T-SQL
  • PL/SQL
  • Base SAS
  • HTML
  • XML
  • VB.NET
  • C
  • UNIX
  • Shell Scripting
  • Python

DataBase Tools

  • Microsoft SQL Server 2000/2005/2008/2012/2014
  • MySQL
  • Oracle 11g/10g/9i/8i
  • DB2
  • MS Access 2000
  • Teradata V2R6.1
  • Cassandra
  • MongoDB

Packages

  • Microsoft Office Suite
  • Microsoft Project 2010
  • SAP
  • Microsoft Visio

ETL Tools

  • Informatica 7.1/6.2
  • Data Junction
  • Ab-Initio
  • DataStage
  • SSIS
  • BODS

Other Tools

  • SAS Enterprise Guide
  • SAP ECC
  • Panorama
  • Web Service
  • FACETS
  • Spark
  • Hive

Timeline

Sr. Data Modeler/Data Analyst

Capital Group
02.2017 - Current

Sr. Data Modeler/Data Analyst

AmeriSource Bergen
08.2014 - 12.2016

Data Modeler

HCSC
02.2013 - 05.2014

Data Modeler/Data Analyst

Pearson
03.2012 - 01.2013

Bachelor of Technology - Computer Science
