
Data Analyst/Data Modeler/ETL Resume


SUMMARY

  • Experience across various industries, including Insurance, Retail Distribution, Procurement, Manufacturing, Pharmaceuticals, Health Care, and Inventory Management.
  • Experience in working with business users/SMEs (Subject Matter Experts) as well as senior management.
  • Strong data modeling experience using ER diagrams, dimensional data modeling, and conceptual/logical/physical modeling with Third Normal Form (3NF), Star Schema, and Snowflake modeling, using tools such as Erwin.
  • Conducted data modeling review sessions for different user groups, participated in requirement sessions to identify requirement feasibility.
  • Created DDL scripts for implementing data modeling changes (a sketch follows this list). Created Erwin Crystal Reports in HTML and RTF formats depending on the requirement, published data models to the Model Mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Extensively used Erwin for reverse engineering, forward engineering, subject areas, domains, naming standards documents, etc.
  • I have acquired a comprehensive grasp of technologies such as Informatica PowerCenter, Business Objects, Oracle PL/SQL, SQL Server 2012, Visual Studio 2010, and .NET (MVC). My duties were to architect, design, model, and implement an enterprise data warehouse; understand business processes, reporting requirements, and underlying data structures; implement change requests applying GxP standards; and design, develop, and support comprehensive data warehouse systems that balance optimization of data access against batch loading and resource utilization factors, according to customer requirements.
  • I am well versed in data warehousing methodologies for implementing end-to-end data warehousing solutions. Currently, my primary role covers data modeling and solution architecture. My technical knowledge is complemented by good communication skills, and I have always taken the initiative in meeting project deadlines without compromising the quality of deliverables. Self-motivated, with excellent written/verbal communication, leadership, and teamwork skills.
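
As referenced above, a minimal sketch of the kind of DDL script forward-engineered from an Erwin physical model; the table, column, and constraint names are hypothetical, not from an actual engagement:

    -- Hypothetical Oracle DDL generated from an Erwin physical model.
    -- Names follow illustrative naming standards, not a client's.
    CREATE TABLE dim_customer (
        customer_key    NUMBER(10)    NOT NULL,  -- surrogate key
        customer_id     VARCHAR2(20)  NOT NULL,  -- natural key from source
        customer_name   VARCHAR2(100),
        effective_date  DATE          NOT NULL,  -- SCD Type 2 effective date
        end_date        DATE,                    -- null while the row is current
        CONSTRAINT pk_dim_customer PRIMARY KEY (customer_key)
    );

The dim_/pk_ prefixes stand in for whatever the model's naming standards document prescribes.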

TECHNICAL SKILLS

AWS: EC2, IAM, S3, Redshift, CloudFormation, CloudWatch, CloudFront, CodeDeploy, CodePipeline, CodeCommit, Cloud9, SNS, RDS, Route 53, Elastic Beanstalk, Elastic Load Balancer (ELB), Amazon Cognito

Database Systems: Oracle 12c/11g/10g/9i/8i/8/7.x, MS SQL Server 6.5/7.0/2000/2012, DB2 UDB, MS Access 7.0/2000/2007

Office Automation: MS Office

Languages: C, C++, SQL, PL/SQL, VB 6.0, CFML, HTML, .NET

Operating Systems: MS-DOS, Windows 95/98/NT/2000/XP, Unix (SCO, Sun Solaris), Red Hat Linux 8.0

System Design: Visio

Data Modeling Tools: Erwin 9.64/9.5/8.2/7/4.2

ETL & Business Intelligence: Informatica 10.1.1/9.6.1/6.2.2/5.1.2, Talend, SSIS 2012, Business Objects 5.1.6/6.5, SSRS/SSAS, Power BI (Desktop, Service, and Mobile), Tableau 9 (Desktop, Server, Public, Reader, Online, Mobile)

CRM: Salesforce.com, JD Edwards Procurement

PROFESSIONAL EXPERIENCE

Confidential

Data Analyst/Data Modeler/ETL

Responsibilities:

  • Design, develop and support comprehensive data warehouse systems to balance optimization of data access with batch loading and resource utilization factors, according to customer requirements.
  • Use CA Erwin to design models; publish the models to CA Erwin Web Portal tool to view the metadata information about the tables and lineage between the data elements.
  • Create the conceptual, logical and physical data models.
  • Manage the model mart efficiently to keep all versions of models up to date.
  • Generate DDL scripts and provide them to the database administrators for physical implementation on the database server.
  • Develop data cleansing and data profiling techniques (a profiling sketch follows this list).
  • Apply forward/reverse engineering techniques and database normalization.
  • Create metadata repository of the various data elements in the data warehouse and data marts.
  • Create the source-to-target mapping document and end-to-end document.
  • Implement the metadata strategy and build the metadata repository using Erwin, maintaining business rules, data dictionaries, and data transformation rules.
  • Work with ETL team by providing required documents and generating business reports.
  • Design and Maintain the Enterprise Logical Data Model (ELDM).
  • Clarify doubts and concerns regarding the proposed solution to be developed.
  • Discuss and prioritize project scope changes and gain agreement.
  • Write complex queries to analyze and validate source data to create the dimensional model.
  • Share modules under development with stakeholders and solicit their feedback.
  • Ensure milestones are met by managing stakeholders' expectations.
  • Work on the data warehouse, databases, the Operational Data Store (ODS), and data marts.
  • Resolve conflicts and manage communications with the project team.
  • Provide Data Architecture direction and ensure that the SDLC process is followed and expectations are met.
  • Conduct design walk-through sessions with the Business Intelligence team to ensure that reporting requirements are met for the business.
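
A minimal sketch of the kind of profiling query used to analyze and validate source data before modeling; the staging table and columns are hypothetical:

    -- Hypothetical profiling of a staging table: volume, key uniqueness,
    -- null rates, and date range of the candidate business key.
    SELECT COUNT(*)                                           AS total_rows,
           COUNT(DISTINCT policy_id)                          AS distinct_policy_ids,
           SUM(CASE WHEN policy_id IS NULL THEN 1 ELSE 0 END) AS null_policy_ids,
           MIN(issue_date)                                    AS min_issue_date,
           MAX(issue_date)                                    AS max_issue_date
    FROM   stg_policy;

    -- Duplicates on the natural key, checked before promoting it to a dimension.
    SELECT policy_id, COUNT(*) AS cnt
    FROM   stg_policy
    GROUP  BY policy_id
    HAVING COUNT(*) > 1;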

Environment: CA Erwin 9.64, Microsoft SQL Server Management Studio 14.0, SQL Azure (12.0.2000.8)

Confidential

Data Analyst/Data Modeler/ETL

Responsibilities:

  • Re-architect the portal to make it enterprise-capable and host it on AWS (EC2, RDS, S3, ELB). Remodel the database so that multiple operating companies (Confidential Surgery, Biosense Webster Inc.) can use the enterprise application (a remodeling sketch follows this list).
  • Requirement analysis and design for the various enhancements.
  • Implement enhancements (Oracle package changes) such as forecasting for almost 110 weeks, improved scorecard calculations, and a CPDM document repository for the new source system. Migrate and load history data.
  • Review the application data model and provide valuable insights on all data-related issues and queries.
  • Conduct design walk-through sessions with the Business Intelligence team to ensure that reporting requirements are met for the business.
  • Provide analysis support for defects and incoming incidents or user requests.
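
One plausible shape of the multi-company remodel, offered as a hedged sketch: introduce an operating-company reference table and qualify shared tables with its key. All names are hypothetical:

    -- Hypothetical remodel so multiple operating companies share one schema.
    CREATE TABLE operating_company (
        company_id    NUMBER(5)     NOT NULL,
        company_name  VARCHAR2(100) NOT NULL,
        CONSTRAINT pk_operating_company PRIMARY KEY (company_id)
    );

    -- Qualify an existing shared table (illustrative name) by company.
    ALTER TABLE forecast ADD (company_id NUMBER(5));

    ALTER TABLE forecast ADD CONSTRAINT fk_forecast_company
        FOREIGN KEY (company_id) REFERENCES operating_company (company_id);

Application queries would then filter on company_id so each operating company sees only its own data.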

Environment: CA Erwin 8.2, Oracle 12c/11g/10g/9i, Informatica 10.1.1/9.6.1/6.2.2, Linux, ASP.NET, Java, PL/SQL, JDE, Tableau, Toad for Oracle 11.6, Visual Studio 2010, Eclipse, Git, MS Access 7.0, Tidal, Beyond Compare.

Confidential

ETL Lead

Responsibilities:

  • Designed and developed a complex ETL module consisting of 7 sub-modules.
  • Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment. Developed working documents to support findings and assign specific tasks.
  • Wrote SQL queries to scan individual data records across many sources.
  • Performed data analysis and data profiling using complex SQL on various sources.
  • Used Erwin reverse engineering to connect to the existing database and Operational Data Store, creating graphical entity-relationship representations to elicit more information.
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, and business rules in accordance with the Corporate Data Dictionary.
  • Worked on ETL to develop source-to-target data mappings with transformation rules, physical naming standards, data types, domain definitions, and corporate metadata definitions.
  • Used Informatica variables to store values used across various components of the data flow.
  • Modified the existing database packages to meet changed business requirements.
  • Used Informatica transformations such as Lookup, Source Qualifier, Filter, Aggregator, and Joiner, along with Script and Send Mail tasks.
  • Designed, developed, and deployed packages.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications, utilizing data management software and tools such as MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings (a reconciliation sketch follows this list) and developed a traceability matrix of business requirements mapped to test scripts, so that any change control on requirements leads to a test case update.
  • Performed extensive data validation by writing several complex SQL queries; involved in back-end testing and working through data quality issues.
  • Fine-tuned stored procedures, SQL queries, and triggers to improve performance.
  • Worked with end users to gain an understanding of information and core data concepts behind their business.
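
A minimal sketch of the kind of SQL used to test mappings and validate loaded data, assuming hypothetical source and target tables:

    -- Hypothetical reconciliation between source and target: compare row
    -- counts and amount totals for the same load window.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS total_amt
    FROM   src_orders
    WHERE  order_date >= DATE '2015-01-01'
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amount)
    FROM   fact_orders
    WHERE  order_date >= DATE '2015-01-01';

    -- Rows present in the source but missing from the target.
    SELECT s.order_id
    FROM   src_orders s
    LEFT JOIN fact_orders f ON f.order_id = s.order_id
    WHERE  f.order_id IS NULL;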

Environment: CA Erwin 7, Oracle 10g/9i, Informatica 9.6.1/6.2.2, Linux, PL/SQL, Toad for Oracle, MS Access 7.0, Control-M.

Confidential

Project Lead/Data Modeler/ETL LEAD

Responsibilities:

  • Involved in loading master data into Salesforce (Comment and Product data) using Informatica.
  • Understood the existing Argus feed from Newton (a PL/SQL package) and converted it into technical requirements; designed the new system using Informatica.
  • Created a detailed mapping document covering the mappings between the previous Newton system and Argus, and the corresponding Salesforce-to-Argus mapping.
  • Drafted a detailed plan for the project's timelines and resource requirements.
  • Created the high-level design.
  • Created detailed low-level designs.
  • Reviewed the mappings.
  • Involved in the design and review of the Health Effect section in Salesforce, to ensure that all data elements available in Newton are available in Salesforce.
  • Created and maintained database objects such as tables, views, indexes, partitions, and database triggers (a partitioning sketch follows this list).
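
A minimal sketch of the partitioned-table DDL referenced above; the table, column, and partition names are hypothetical:

    -- Hypothetical range-partitioned Oracle table with a local index,
    -- illustrating the kind of database objects created and maintained.
    CREATE TABLE case_event (
        case_id     NUMBER(12)   NOT NULL,
        event_date  DATE         NOT NULL,
        event_type  VARCHAR2(30)
    )
    PARTITION BY RANGE (event_date) (
        PARTITION p2004 VALUES LESS THAN (TO_DATE('2005-01-01', 'YYYY-MM-DD')),
        PARTITION p2005 VALUES LESS THAN (TO_DATE('2006-01-01', 'YYYY-MM-DD')),
        PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );

    CREATE INDEX ix_case_event_case ON case_event (case_id) LOCAL;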

Environment: CA Erwin 7, Oracle 9i, Informatica 6.2.2, Linux, PL/SQL, Salesforce, Toad, Control-M.

Confidential

Onsite Lead - Technical (Data Modeler and ETL Lead)

Responsibilities:

  • Requirement analysis and design for the various QC items (implementation of the ICOMPLY source system, modification of the Front Office Producer source, and implementation of new rules for the primary agent).
  • Coordination and communication with the Compliance business, Confidential IT, and the offshore team.
  • Planning, estimation, and execution of tasks such as LLD updates, LLD review, coding and code review, UTC updates, UTC review, unit testing, integration testing, QA migration, and production migration.
  • Tested the new QA environment (systest) set up for Compliance.
  • Provided a checklist and set up a process for disaster recovery of the Compliance Warehouse and Mart.
  • Guided and supported development of the new MF-to-Annuity feed and the disclosed Replacement feed.

Environment: CA Erwin 7, DB2, Informatica 6.2.2/5, Unix.

Confidential

Business Analyst, Data Modeler and ETL Lead

Responsibilities:

  • Understood the design and implementation of previous source systems in the Warehouse and Mart by studying the data models, requirement analysis, gap analysis, and high-level design documents.
  • Studied the AWL input specs and the AWL Product Rollout Guide to understand the input file layout and the significance of each input element. Analyzed the Product Rollout Guide to identify the available riders (plan codes), the breakdown of premiums as they appear in the input Premium Tracker file, the different modes of payment, the available transaction types, and policy-related dates (issue date, due date, and anniversary dates).
  • Held multiple meetings with the GDC business analyst as well as the existing team to understand the existing system and the changes needed to load AWL into the existing Warehouse and Data Mart.
  • Verified the existing logical keys for warehouse tables such as Policy, Coverage, and Coverage Transaction against AWL functionality, and made changes as required.
  • Designed a comprehensive error-handling strategy to check required fields, numeric fields, date fields, and duplicate records; any transaction failing these checks was rejected and not taken into the system (a sketch of these checks follows this list).
  • Updated the requirement analysis document, gap analysis, and high-level design document for the Warehouse with the AWL changes.
  • Studied the reporting requirements for the Mart and made corresponding changes to the Mart functional specs. Created data examples so users could understand the data available. Updated the definitions of key reporting measures, such as First Year Planned Premium, Second Year Planned Premium, Single Premium, Recurring Premium, and Excess over Recurring Premium, based on user inputs for the Whole Life product.
  • Planning, estimation, and execution of tasks such as LLD updates, LLD review, coding and code review, UTC updates, UTC review, unit testing, integration testing, and QA migration of the AWL changes for the Sales Data Mart.
  • Was instrumental in carrying out the changes required to implement AWL in the Mart.
  • Instrumental in the redesign of the Organization and Producer dimensions for easy and fast reporting.
  • Finalized the ETL strategy of implementing all premium calculations in ETL rather than in MicroStrategy.
  • Reviewed the existing data in the data mart for correctness; instrumental in finding discrepancies in the VTRD premium calculations and the policy count calculation.
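
A minimal sketch of the reject checks behind the error-handling strategy above; the staging table and key columns are hypothetical:

    -- Hypothetical required-field check on the staged Premium Tracker feed;
    -- offending transactions are routed to a reject process, not loaded.
    SELECT trans_id
    FROM   stg_premium_tracker
    WHERE  policy_number IS NULL
       OR  plan_code     IS NULL
       OR  premium_amt   IS NULL;

    -- Duplicate transactions on the logical key, also rejected.
    SELECT policy_number, trans_type, trans_date, COUNT(*) AS cnt
    FROM   stg_premium_tracker
    GROUP  BY policy_number, trans_type, trans_date
    HAVING COUNT(*) > 1;

The numeric-field and date-field format checks described above would be handled in a similar pass before the load.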

Environment: CA Erwin 7, DB2, Informatica 6.2.2/5, Unix.

Confidential

ETL Lead

Responsibilities:

  • Reviewed the data model, consisting of almost 50 stars, to confirm that all data elements are present at the correct grain. Performed data profiling using SQL queries to identify data quality issues, and studied the source system and data dictionary to understand the data.
  • Assigned tasks to team members, reported daily status to the client on calls, and got queries resolved with the onsite team and client. Ensured timelines were met and guided team members whenever they encountered problems.
  • Designed the HLD as well as the LLD for almost thirty stars, involving almost 50 dimensions and 30 facts.
  • Planned and estimated the tasks to be completed. Coordinated each stage of the project, from requirement analysis to production rollout.
  • Set up a review process. Provided inputs for the review checklists and established coding best practices (e.g., creating and reusing Lookups whenever possible, and using uncommitted reads in the Source Qualifier query; a sketch follows this list).
  • Developed a detailed unit test plan as well as an integration test plan. Prepared exhaustive unit test cases and a functional and technical integration test plan for ensuring data quality, and automated the process using scripts.
  • Supported different environments (QA, UAT, DEV fixes) simultaneously, with quick analysis and fast turnaround on all QA/UAT issues. Planned the production rollout; gave suggestions and reviewed and updated the Runbook for CDM. Conducted KT with production support to ensure a smooth migration to production.
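
A minimal sketch of the uncommitted-read practice mentioned above, written as a DB2 Source Qualifier SQL override; the table and columns are hypothetical:

    -- Hypothetical Source Qualifier override: read with uncommitted-read (UR)
    -- isolation so the extract does not hold row locks on the source table.
    SELECT d.dim_key,
           d.natural_id,
           d.description
    FROM   edw.dim_product d
    WHERE  d.load_date >= CURRENT DATE - 1 DAY
    FOR READ ONLY WITH UR;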

Environment: CA Erwin 7, DB2, Informatica 6.2.2/5, Unix.

Confidential

Technical Architect

Responsibilities:

  • Visited Atlanta for two months for knowledge transfer and to understand the existing architecture: a data warehouse of 85 data marts covering a wide range of functional domains such as Manufacturing, Finance, Sales and Orders, Travel and Living, and Logistics.
  • Involved in the design and implementation of the upgrade strategy for the existing repositories (Dev/QA/Prod) from Informatica 5.1.2 to 6.2.2. Tested loads before migration to 6.2.2 and provided support for all issues faced after the upgrade.
  • Established best practices across various technologies and made sure they were followed. Stabilized operations processes and suggested various methods for improving operations for the GEPS data warehouse, e.g., implementing BO reporting on ETL metadata for first-hand, fast analysis of load-related data, and implementing a load-failure notification script for immediate notification and follow-up action (a sketch follows this list). Established robust error-handling mechanisms in all implemented scripts (file format checking, data validation scripts).
  • Instrumental in establishing a Support Central site for operations, and introduced a new weekly reporting format to capture metrics such as mean time between failures, mean time to recover, and percentage within/outside SLA.
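
A minimal sketch of the query a load-failure notification script might run; the audit table is hypothetical (not the actual Informatica repository schema):

    -- Hypothetical check for failed loads since midnight; a wrapper script
    -- would mail these rows to the operations team for immediate action.
    SELECT load_name,
           start_time,
           end_time,
           error_message
    FROM   etl_load_audit
    WHERE  status = 'FAILED'
      AND  start_time >= TRUNC(SYSDATE);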

Environment: CA Erwin 4, Oracle, Informatica 5.1.2/6.2.2, Business Objects 5/6.5, Unix, PL/SQL.
