Scrum Master/ETL Lead Resume
St Louis, MO
SUMMARY:
- Broad Information Technology experience with a focus on Data Integration and Data Warehouse/Data Mart development, including strategies for extraction, transformation, and loading (ETL) with Informatica PowerCenter from various database sources
- Experience working with ETL tools: Informatica PowerCenter/PowerExchange/Developer, SSIS 2008 R2, SQL Server 2008 R2, QlikView, SyncSort, and MicroStrategy
- Experience working with databases: Oracle Exadata/11g (SQL*Loader, SQL Execute), DB2, and Teradata, including Teradata Parallel Transporter (TPT), FastLoad, MultiLoad, BTEQ, and FastExport
- Experience writing scripts in UNIX Shell, AWK, sed, Python, and Perl, with Autosys scheduling
- Data modeling and architecture: data integration and ETL solution design; implementing dimensional models (Star or Snowflake schema). Working experience with Inmon's top-down and Kimball's bottom-up approaches as business needs dictate, and with data processing mechanisms such as OLAP, ROLAP, and HOLAP (hybrid)
- Develop and review ETL solutions using Informatica PowerCenter 9.x/10.x and the SSIS 2008 R2 / SQL Server 2008 R2 ETL frameworks
- Sound experience creating logical and physical data models using various modeling techniques
- Requirements gathering, gap/impact analysis, preparing FSD/TRD/ETL design documents, and creating and maintaining the RTM (Requirements Traceability Matrix)
- Worked on legacy IBM mainframe technologies involving COBOL and DB2
- Extensive experience across all phases of the software development life cycle, including production support, maintenance, and application development in an onshore/offshore model using Agile, Waterfall, and Iterative methodologies
- Worked on designing tables, views, indexes, stored procedures, functions, triggers, common table expressions (CTEs), and partitioning/dynamic partitioning in Oracle
- Proven experience leading, mentoring, reviewing, and coordinating teams of 20+ resources
- Extensive experience in end-to-end project execution, applying technical and functional skills and adopting IT industry best practices during implementation
- Beginner-level experience with MDM and the Hadoop ecosystem: MapReduce, HBase, Pig, Hive, HDFS, and Core Java
- Worked in Insurance, Banking, Manufacturing and Airline domains.
TECHNICAL SKILLS:
Databases: Oracle 11g/Exadata 12c, MS SQL Server 2013/2012/2008 R2, DB2, VSAM, IDMS
Scripting: SQL, PL/SQL, UNIX Shell/AWK/SED Scripting.
Languages: COBOL/JCL/CICS, C, C++, Java, J2EE, XML, HTML, JavaScript
ETL / Data Analysis / Visualization: Informatica 9.x/10.x PowerCenter/PowerExchange/Developer, SQL Developer/Navigator, TOAD, SQL*Loader, MicroStrategy, T-SQL, OBIEE
Operating Systems: IBM Mainframe (MVS - OS/390), Windows 2008/7/XP, LINUX, UNIX
Other Tools: Erwin, VISIO, CA7/AUTOSYS Scheduler, HP ALM Quality Center, MS Office 2013/07/03, NDM File Transfer, Harvest, GitHub, UDeploy, Jenkins
EXPERIENCE DETAIL:
Confidential, St Louis, MO
Scrum Master/ETL Lead
Tools used: GitHub, UDeploy, Jenkins, Harvest, SQL Developer, PuTTY, WinSCP, HP Service Center
Responsibilities:
- Analyze TRD/BRD and study the impact on existing jobs/loads.
- Work closely with the Data Modeler to gather requirements from the DA and finalize the data model from a data integration perspective.
- Study the source and Confidential systems and the nature of the data load/extract; identify ETL connection, privilege, and access requirements for the source/Confidential systems to perform the task.
- Architect the ETL process flow document based on the requirements and create/design the ETL solution accordingly (highlighting data-lift strategy, staging requirements if any, exception handling, performance, and reusability).
- Design and create temporary table structures, data seed requirements, and the error/data rejection strategy.
- Design and create SCD Type-1 and Type-2 loads to process deltas from the Oracle source to the Oracle Confidential target, developing ETL Informatica mappings (involving Joiner, Lookup, Expression, Router, Rank, Source Qualifier, Filter, Sorter, and Update Strategy transformations) to transform the data.
- Create/develop UNIX shell, AWK, and sed scripts, SQL*Loader control files, and SQL Execute scripts to execute the Informatica PowerCenter workflows.
- Configure the dropbox to process new deliveries and capture data from various sources before movement to and from the ODS.
- Create database scripts (DDL and DML to create tables, alter indexes, and grant privileges) to implement the data model in all environments.
- Create backout procedures/scripts to roll back and revoke changes, ensuring system stability in case of a rollback.
- Create Autosys JILs; manage and insert job boxes, dependencies, and file watchers.
- Use Informatica DVO (Data Validation Option) to execute comprehensive validation tests.
- Coordinate with the test team to support system integration testing; debug and fix defects reported in the SIT environment.
- Work with Release coordinator on release window and production handover document.
- Used CA Harvest implementation packages to deploy to higher environments, along with GitHub, CloudBees Jenkins, and IBM UDeploy.
- Perform change management activities using PAC2000 v7.6 (Remedy software).
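The SCD Type-2 delta load described above can be sketched outside Informatica. Below is a minimal Python/sqlite3 illustration of the same logic (close the current row when a tracked attribute changes, then insert a new current row); the table, columns, and dates are hypothetical, not from any actual project.

```python
# Minimal SCD Type-2 delta sketch using sqlite3 in place of Oracle/Informatica.
# Table and column names (dim_customer, city, eff_from/eff_to) are illustrative.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE dim_customer (
    cust_id INTEGER, city TEXT,
    eff_from TEXT, eff_to TEXT, is_current INTEGER)""")

def apply_delta(con, cust_id, city, load_date):
    """Close out the current row if the tracked attribute changed,
    then insert a new current row (Type-2 history)."""
    cur = con.execute(
        "SELECT city FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if cur is not None and cur[0] == city:
        return  # no change: nothing to do
    if cur is not None:
        # expire the old version as of the load date
        con.execute(
            "UPDATE dim_customer SET eff_to=?, is_current=0 "
            "WHERE cust_id=? AND is_current=1", (load_date, cust_id))
    # insert the new current version with an open-ended effective date
    con.execute(
        "INSERT INTO dim_customer VALUES (?,?,?,'9999-12-31',1)",
        (cust_id, city, load_date))

apply_delta(con, 1, "St Louis", "2014-01-01")
apply_delta(con, 1, "Edison", "2014-06-01")   # attribute change -> history row
rows = con.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY eff_from").fetchall()
print(rows)  # [('St Louis', 0), ('Edison', 1)]
```

In a PowerCenter mapping the same branch is typically made by a Lookup on the current dimension row feeding an Update Strategy (DD_UPDATE to expire, DD_INSERT for the new version).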
Confidential, NY, NY
Scrum Master/ETL Lead
Responsibilities:
- Work with the Data Architect and BA through requirements; develop logical and physical data flow models.
- Delegate work and lead, mentor, and coordinate with the offshore team.
- Build the ETL approach and process flow: data-lift strategy, profiling, staging, exception handling, performance, reusability, incremental strategy, and data seed strategy for downstream web services; work with the Business on profiling and Data Quality (DQ); produce the source-to-Confidential mapping document.
- Extract, transform, and integrate Organization/DEBT/RATING data from various databases/schemas (including the Golden Source database) and build hierarchies, denormalizing the core data.
- Create ETL Informatica mappings (involving Joiner, Lookup, Expression, Router, Rank, Source Qualifier, Filter, Sorter, and Update Strategy transformations) to transform data to the target, refactoring/retrofitting iteratively to keep the ETL agile.
- Build the delta approach and unit test the ETL in the development environment before moving to UAT; follow the test cycle and provide support.
Environment: Informatica PowerCenter 10.x, Informatica PowerExchange, Oracle 11g, MS SQL, UNIX Shell Scripting (PuTTY/WinSCP), Web Services, XML, VISIO, AUTOSYS Scheduler, HP ALM Quality Center.
Confidential, St Louis, MO
Scrum Master/ETL Lead
Responsibilities:
- Following the complete SDLC process; designing the incremental strategy, reusability, error handling, and bottleneck analysis.
- Created complex ETL mappings (SCD Types 1 & 2) to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, connected and unconnected Lookups, Filter, Sequence Generator, XML Parser, Router, and Update Strategy. Identified bottlenecks in the sources, targets, mappings, and sessions and resolved them.
- Capture metadata, capacity plans, and performance metrics for all queries created; define the archival strategy and provide guidance for performance tuning.
- Involved in a pilot project to analyze data migrations to and from Hadoop (Hive).
- Used Informatica PowerExchange and Informatica Developer (both methods, for comparison) to establish connections through the Hadoop Hive connector.
- Moved data to and from (source and target) different nodes from data held in files (from legacy systems).
- Used Hive to add structure to datasets stored in HDFS (Hadoop).
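The "add structure to datasets stored in HDFS" step above is typically done with a Hive external table, which overlays a schema on delimited files already sitting in HDFS so legacy extracts can be queried in place. An illustrative HiveQL sketch, where the table name, columns, delimiter, and HDFS path are all assumptions:

```sql
-- Illustrative only: names, types, and the HDFS location are hypothetical.
CREATE EXTERNAL TABLE legacy_policy (
  policy_id   STRING,
  holder_name STRING,
  premium     DECIMAL(10,2)
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '/data/legacy/policy/';
```

Because the table is EXTERNAL, dropping it removes only the Hive metadata; the underlying HDFS files are left untouched.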
Confidential, Cincinnati, Ohio
Scrum Master/ETL Lead
Responsibilities:
- Responsible for onshore/offshore coordination: assigning and tracking tasks with offshore, resolving issues by coordinating the client, architects, business, and team, and bridging gaps to ensure delivery as expected per scope.
- Develop the physical data model for the ETL source (DB2/flat file/VSAM/XML) to Confidential mapping and the process flow diagrams for all business functions.
- Converted the data mart from logical to physical design; defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the database objects.
- Set up PowerExchange connections to databases and mainframe files.
- Develop mappings (using Unconnected Lookup, Sorter, XML Parser, Aggregator, Dynamic Lookup, and Router transformations), sessions, and workflows to transfer data from multiple sources to staging and then from staging to the data warehouse.
- Work out the best approach for getting incremental/delta data from the source to load into the staging area.
- Work with developers, DBAs, and systems support personnel to promote and automate successful code to production.
Environment: Informatica PowerCenter 9.6.1/PowerExchange, ERwin, Oracle 11g, flat files, sequential files, SQL, PL/SQL, SPUFI, Platinum, SQL*Plus, IBM Mainframe, DB2, VSAM, XML, UNIX (WinSCP).
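The incremental/delta approach mentioned above usually keys off a persisted watermark (in Informatica, often a mapping variable holding the last successful load time). A minimal Python/sqlite3 sketch under that assumption, with illustrative table and column names:

```python
# Watermark-based delta extraction sketch; sqlite3 stands in for the source
# database, and src_orders/updated_at/last_run are illustrative names.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE src_orders (order_id INTEGER, updated_at TEXT)")
con.executemany("INSERT INTO src_orders VALUES (?,?)",
                [(1, "2014-01-01"), (2, "2014-02-15"), (3, "2014-03-20")])

last_run = "2014-02-01"  # watermark persisted between loads
# only rows changed since the last run are pulled into staging
delta = con.execute(
    "SELECT order_id FROM src_orders WHERE updated_at > ? ORDER BY order_id",
    (last_run,)).fetchall()
print([r[0] for r in delta])  # [2, 3]
```

After a successful load the watermark is advanced to the run's start time, so rows modified mid-run are picked up on the next cycle rather than lost.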
Confidential, Edison, NJ
Scrum Master/ETL Lead
Responsibilities:
- Extensively used MS SQL Server Integration Services (SSIS) for extracting, transforming, and loading data from sources including Oracle, DB2, and flat files. Collaborated with the EDW team on high-level design documents for the extract, transform, validate, and load (ETL) process: data dictionaries, metadata descriptions, file layouts, and flow diagrams.
- Created SSIS packages to transfer data from multiple sources to staging and then from staging to the data warehouse.
- Developed complex reports using multiple data providers, user-defined objects, aggregate-aware objects, charts, and synchronized queries with SSRS.
- Created multiple dashboards with QlikView for analytics, providing drill-down, drill-up, and slice-and-dice options to end users.
- Scheduled SSIS packages to execute daily, weekly, or monthly with Dollar Universe ($U).
Environment: SSIS 2008 R2, SQL Server 2008 R2, QlikView
Confidential, Edison, NJ
Scrum Master/ETL Lead
Responsibilities:
- Independently handled the multi-platform and multi-location project (Team size 20).
- Worked with the DWH team to create the warehouse as input to the MST development.
- Worked with data modelers to create the data model/DWH design to kick off the ETL development process.
- Involved in the ETL design (Informatica PowerCenter), implementing CDS1 and CDS2, load plans, and various transformations such as Union, Derived Column, Aggregator, Lookup, and Sorter.
- Involved in dashboard design and architecting visualizations using MicroStrategy.
- Project Management and coordination activities:
- Managing and coordinating teams at different locations, driving toward technical consensus; coordinating with various Citi application teams on knowledge transfer activities and facilitating calls/meetings as required.
- Preparing the project plan (MPP); responsible for project (sprint)/resource planning.
- Setting up the dependencies and critical path for the project.
- Assigning resources to various tasks and ensuring the timelines according to the plan.
- The project being agile in nature: rigorous tracking/monitoring of all tasks, risks, and dependencies; maintaining the impediment and action logs.
Environment: Informatica PowerCenter 9.x, DB2, MPP, MicroStrategy, Core Java, Erwin, SQL*Plus, UNIX.
Confidential
Scrum Master/ETL Lead
Environment: Informatica PowerCenter 8.6, Oracle 10g, IBM Mainframe/COBOL/JCL/DB2, flat files, Siebel Apps, File-AID, SyncSort.