
Senior BI Solution & Data Architect/Data Modeler/ETL Developer Resume


Pittsburgh, PA

SUMMARY

  • Around 12 years of experience working on large and complex Data Warehousing, Data Integration, and Data Migration projects.
  • Extensive hands-on experience with Informatica PowerCenter, Informatica Cloud Services, Teradata and its loader utilities (MLOAD, FLOAD, FAST EXPORT, and TPUMP), Oracle SQL, PL/SQL, and Unix shell scripting. Sound knowledge of data warehousing concepts and the SDLC.
  • Strong experience in performance tuning of ETL jobs, both at the Informatica ETL level and at the database level.
  • Performed data warehousing requirements gathering, analysis, design, ETL development, implementation, integration, testing, profiling, cleansing, and validation of data for the Life Science, Pharma, and Health Care domains.
  • Expertise in designing Data Integration, Data Migration, and Data Warehousing projects. Have implemented a couple of projects from scratch.
  • Proficient in designing the automation of workflows and configuring/scheduling workflows for load frequencies. Skilled in developing, testing, tuning, and debugging mappings and sessions and in monitoring the system. Experienced with job scheduling tools such as Control-M and Autosys.
  • Experience as an ETL Lead:
  • Evaluating business requirements to produce effort estimates.
  • Coordinating with Offshore team on the requirement discussion, build and test status and resource allocation.
  • Reviewing code and preparing unit testing documentation.
  • Training the new team members on the project functional area.
  • Maintaining the coding standards.
  • Experience as a Solution Architect, playing the roles below:
  • Building an ETL framework to extract data from various kinds of sources, such as XML, SAP, Siebel, and Salesforce, during data integration.
  • Providing proofs of concept for new technologies such as AWS S3 and Redshift, and suggesting the data loaders to use for data movement into the data warehouse.
  • Have overall knowledge of the Data Warehouse to support any new reporting requirement.
  • Always ready to learn and deliver on new technologies and take up new tasks.

TECHNICAL SKILLS

  • Windows Server 2014
  • MS/SQLServer 2014
  • SQL Server Data Tools for Visual Studio 2013
  • SSIS
  • Tableau
  • PeopleSoft
  • Great Plains
  • Microsoft Dynamics AX
  • Oracle EBS R12 Financials
  • Hyperion Planning
  • Power BI Pro

PROFESSIONAL EXPERIENCE

Confidential, Pittsburgh, PA

Senior BI Solution & Data Architect/Data Modeler/ETL Developer

Responsibilities:

  • Designed, developed, and managed project plans in a complex, dynamic environment, revising plans to meet changing requirements.
  • Responsible for developing business intelligence solutions using the Tableau Suite of products from Tableau Desktop to Tableau Server.
  • Worked with team members and business leaders to identify opportunities to turn data analytics into insights for intelligent decision-making. Helped Financial Analysts develop stories about the data using Tableau Data Visualization and Workbook concepts.
  • Designed, maintained and managed Tableau driven dashboards
  • Performed data profiling with Informatica IDQ and Oracle database.
  • Managed the report life cycle: business requirements gathering, data analysis, query development, and report development.
  • Participated in the definition, development and implementation of data warehouse and data mart databases and their content.
  • Implemented and supported ETL processes and practices in a high availability data warehousing and data mart environments.
  • Planned and coordinated ETL and database changes with project team members, development and database engineers.
  • Designed, developed, tested, optimized and deployed ETL solutions which consisted of batch load packages, workflow processes, stored procedures, related functions and various scripting tasks
  • Worked with the Infrastructure team to help develop and uphold data governance policies and procedures to ensure standardized data naming, establish consistent data definitions and monitor overall data quality for assigned data entities
  • Developed and supported creation and management of KPIs Dashboards / Financial Analytical Reporting Solutions.
  • Gathered and documented business requirements for new sources of data and data warehouse enhancements. Translated requirements, source mapping documents, and models into solutions. Responsible for applying Master Data Services and Data Quality Services (MDS, DQS) strategies, requirements, and policies aligned to the company’s goals for data quality, data security, and data integration.
  • Interfaced with Network Engineer, Windows Systems Engineer, MS/SQLServer DBA and Business users.
  • Designed New Analysis Services and Tableau Cubes.
  • Operating systems and tools: Windows Server 2014. Database servers: MS/SQLServer 2014 Enterprise Edition, SQL Server Data Tools for Visual Studio 2013, Analysis Services Enterprise Edition, SSIS. BI and applications: Tableau, PeopleSoft, Great Plains, Microsoft Dynamics AX, Oracle EBS R12 Financials, Hyperion Planning, Power BI Pro.

Confidential, Pittsburgh, PA

Senior Solution & Data Architect/Data Modeler/ETL Developer

Responsibilities:

  • Developed ETL procedures to automate implementation of Reference Data for both the OLTP side and the OLAP side.
  • Designed and implemented the ETL mapping using Informatica PowerCenter. Extracted data from various sources and performed transforms in the DEV environment.
  • Produced Entity Relationship Diagrams (ERDs) and STAR schemas for all systems using the Erwin data modeling tool.
  • Performed data analysis and business analysis of the market and sales of biological products and devices using Excel and SQL Server.
  • Responsible for applying Master Data Management (MDM) strategies, requirements and policies aligned to the company’s goals for data quality, data security and data integration.
  • Prepared deliverables such as source-to-target mappings and transformation rules documentation.
  • Designed New Analysis Services Cubes.
  • Developed canned SSRS reports on the MS BI platform for accessing the SSAS cubes as well as the underlying DW tables.
  • Interfaced with Network Engineer, Windows Systems Engineer, SQLServer DBA and Business users.
  • Made extensive use of Visual Studio Data Tools, Team Foundation Server (TFS).
  • Made extensive use of Microsoft SQL Server Integration Services, TSQL.
  • Developed interactive visualizations with Business Intelligence platforms such as QlikView and Tableau.
  • Participated in Database Development review sessions and Database Administration review sessions.

Confidential, TN

IT ETL Informatica Data Analyst

Responsibilities:

  • Developed the logical and physical data models and SQL for data analysis for the development of a Teradata Clinical Data Warehouse, with special effort around Claims processing, FACETS, and clinical entities.
  • Used Informatica Data Quality for extensive data profiling to determine natural keys and data anomalies in the source data.
  • Loaded customer information from different data sources through Informatica PowerCenter, creating mappings with Router, Aggregator, Lookup, Joiner, and Stored Procedure transformations.
  • Followed enterprise naming standards when creating naming conventions in the Erwin modeling tool.
  • Played the role of a Program Technical Lead/ETL Architect and was responsible for designing the ETL Strategy & Architecture of the Project.
  • Responsible for managing tasks and deadlines for the ETL teams. Developed Reference and Master Data subject areas first, then Member, Provider, Client, Eligibility, Claims (FACETS), Finance, and Product data models.
  • Wrote shell commands for converting existing Informatica jobs. Used data analysis to identify data quality gaps in the newly developed model.
  • Developed data models for Episode Treatment Groups, Episode Risk Groups, and Evidence Based Medicine.
  • Developed data models for identification and stratification of members with care gaps.
  • Developed data models for the Patient Centered Medical Homes, an approach for quality centered provider performance.
  • Utilized the national industry model (ADRM) and HL7/HITSP healthcare-standard attributes and message codes to create a standardized model for the integration of medical record data from 38 Provider Group partners and 44 national labs.
  • The ADRM model is a classic implementation of Parties, with a system-generated Party ID.
  • The Party ID relies on an accurate Master Person Identifier process to tie together disparate business keys, namely the patient card number and provider NPI.
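The Party ID resolution above can be sketched as follows. This is an illustrative Python sketch, not the actual ADRM implementation; the class, key types, and values are all hypothetical, and real MPI matching involves probabilistic record linkage rather than explicit `link` calls.

```python
# Hypothetical sketch: disparate business keys resolve to one
# system-generated surrogate Party ID via a crosswalk registry.
from itertools import count

class PartyIdRegistry:
    """Assigns one surrogate Party ID per set of linked business keys."""
    def __init__(self):
        self._next_id = count(1)
        self._key_to_party = {}  # (key_type, key_value) -> party_id

    def resolve(self, key_type, key_value):
        """Return the Party ID for a business key, minting one if new."""
        key = (key_type, key_value)
        if key not in self._key_to_party:
            self._key_to_party[key] = next(self._next_id)
        return self._key_to_party[key]

    def link(self, key_a, key_b):
        """Declare two business keys to belong to the same party (MPI match)."""
        party_id = self.resolve(*key_a)
        self._key_to_party[key_b] = party_id
        return party_id

registry = PartyIdRegistry()
card_party = registry.resolve("PATIENT_CARD", "C-1001")
# The MPI process later matches the same person's lab medical record number:
lab_party = registry.link(("PATIENT_CARD", "C-1001"), ("LAB_MRN", "MRN-77"))
print(card_party == lab_party)  # True: both keys map to one Party ID
```

The crosswalk keeps the warehouse model stable: downstream facts join on the surrogate Party ID, so new source systems only add entries to the registry.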

Confidential, Indianapolis, IN

ETL Informatica Developer / Project Lead

Responsibilities:

  • Involved in mapping analysis, data profiling, design, development, unit testing, system testing, UAT coordination, and implementation support.
  • Designed and developed mappings to load data extracts into staging layers using Informatica.
  • Worked extensively using Teradata BTEQ, Informatica and UNIX shell scripting.
  • Involved in reviewing the deliverables and ensuring quality.
  • Provided solutions, approach and coding for critical items.
  • Coordinated with source systems and Business Analysts to fix mapping, data quality, and source-related issues.
  • Worked with Data Architect for database model changes.
  • Worked with Teradata DBA team to build Database components, resolve access and performance issues.
  • Responsible for ensuring the quality, coding standards and creating required documentations.
  • Responsible for production executions and post-production validations, giving the LO handover, and obtaining sign-off.
  • Collaboration with multiple stakeholders including Client, Business Analyst, Source systems, Release Team, Testing Team and support teams.
  • Derived the deliverables and risks/issues, reported to upper management, and kept track of project status.

Confidential, Malvern, PA

ETL Informatica Developer

Responsibilities:

  • Involved in gathering requirements, translating and preparing the Functional Specifications document, and analyzing requirements.
  • Prepared Design documents and Mapping documents.
  • Developed mappings using Informatica to load the data from different file system to staging area and from Staging Area to Warehouse targets.
  • Created mappings using various transformations like Aggregator, Joiner, Filter, Expression, Router, Look up and Update Strategy.
  • Tested mappings according to unit test cases.
  • Performed integration testing and system testing at the workflow level on each of the servers.
  • Tracked issues and interacted with the client for issue resolution.
  • Designed and developed the ETL framework for identifying and processing the delta between different data sources to implement SCD Type 1 and Type 2.
  • Responsible for data modeling and developing various database objects for integration with different source systems.
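The delta-detection idea behind the SCD framework above can be sketched as follows. This is a minimal Python illustration, not the actual Informatica implementation; the table, column names, and change classes are hypothetical assumptions.

```python
# Illustrative sketch of SCD delta detection: hash the tracked columns
# on source and target rows, then classify each source row.
import hashlib

TYPE1_COLS = ("email",)    # Type 1: overwritten in place, no history
TYPE2_COLS = ("address",)  # Type 2: old version expired, new row inserted

def row_hash(row, cols):
    """Deterministic fingerprint of the given columns of a row."""
    raw = "|".join(str(row[c]) for c in cols)
    return hashlib.md5(raw.encode()).hexdigest()

def classify(source_row, target_rows_by_key):
    """Classify a source row as INSERT, TYPE2, TYPE1, or NO_CHANGE."""
    current = target_rows_by_key.get(source_row["customer_id"])
    if current is None:
        return "INSERT"
    # Type 2 takes priority: a historized change forces a new version.
    if row_hash(source_row, TYPE2_COLS) != row_hash(current, TYPE2_COLS):
        return "TYPE2"
    if row_hash(source_row, TYPE1_COLS) != row_hash(current, TYPE1_COLS):
        return "TYPE1"
    return "NO_CHANGE"

target = {1: {"customer_id": 1, "email": "a@x.com", "address": "Elm St"}}
print(classify({"customer_id": 2, "email": "b@x.com", "address": "Oak"}, target))   # INSERT
print(classify({"customer_id": 1, "email": "a@x.com", "address": "Pine"}, target))  # TYPE2
```

Comparing one hash per column group instead of every column pairwise keeps the lookup cheap, which is why this pattern is common in ETL delta jobs.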

Confidential

Software Engineer

Responsibilities:

  • Designed ETL processes to integrate data between source and target systems.
  • Responsible for optimizing poorly performing database systems and tuning them for optimum performance.
  • Involved in Unit Test, Integration Test, User Acceptance Test (UAT)
  • Designed and Developed the ETL Architecture for integrating the data from the different source system to BI Data Warehouse.
  • Determined the project team and their roles and responsibilities; created the staffing management plan and identified training needs.
  • Handled project deliverables and provided the project status reports needed by management and the customer.
  • Analyzed production defects to identify data issues (if any).
  • Developed, rectified, and troubleshot existing discrepancies in the application using VB6 and SQL Server 2000.
  • Responsible for extracting, transforming, and loading (ETL) data from Excel and flat files into MS SQL Server using Bulk Insert and DTS packages.
  • Wrote and tuned SQL queries, stored procedures, triggers, views, and indexes; performed performance tuning and query optimization using SQL Server 2000.
  • Created cross-tab, sub-report, and parameterized reports per requirements.
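The flat-file-to-staging-to-target load pattern described above can be sketched as follows. This Python sketch uses an in-memory SQLite database purely as a stand-in for SQL Server (the original work used BULK INSERT and DTS packages); all table and column names are hypothetical.

```python
# Minimal sketch of the flat-file -> staging table -> target table pattern.
import csv
import io
import sqlite3

# A tiny in-memory "flat file" standing in for the real extract.
flat_file = io.StringIO("id,amount\n1,10.5\n2,20.0\n3,-5.0\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_sales (id INTEGER, amount REAL)")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

# Bulk-load the file into the staging table in one batch
# (analogous to BULK INSERT into a staging table).
rows = [(int(r["id"]), float(r["amount"])) for r in csv.DictReader(flat_file)]
conn.executemany("INSERT INTO stg_sales VALUES (?, ?)", rows)

# Validate/transform in staging, then load only clean rows to the target.
conn.execute("INSERT INTO sales SELECT id, amount FROM stg_sales WHERE amount > 0")
conn.commit()
loaded = conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0]
print(loaded)  # 2: the negative-amount row is rejected in staging
```

Landing the raw file in a staging table first keeps the target clean and makes rejected rows easy to audit, which is the main reason for the two-step load.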
