Tech Lead (ETL/Informatica/Hadoop) Resume
Tampa, FL
SUMMARY
- Talented ETL developer and tester with outstanding experience and demonstrated expertise in high-end projects in deadline-oriented environments
- 10+ years in Business Intelligence applications, specializing in the development of data warehousing solutions, including design, development, and testing of components
- Strong experience in ETL with Informatica PowerCenter, including complex mapping design using Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations
- Experience with Oracle, Teradata, Greenplum, PostgreSQL, and MySQL databases in building data mart and data warehouse applications
- 2+ years of Hadoop Hive and HBase experience, including daily reconciliation of data between source systems and the DWH
- 2+ years of SSIS ETL processing between source systems and the DWH
- Experience in Informatica BDE
- Experience with other utilities such as SQL*Loader, Teradata FLOAD/MLOAD, Data Loader, WinSCP, and PuTTY
- Worked with job scheduling tools Cronacle, Autosys, and Oozie
- Experience in BI reporting with Cognos, Business Objects, and OBIEE
- Used SVN for version control
- Strong experience in Scrum/Agile methodologies and hands-on experience with Jira
- Strong experience in systems integration and end-to-end validation
- Strong experience in User Acceptance Testing; excellent communication skills and a strong background working directly with clients to identify business objectives and establish requirements
- Experience with Quality Center and SpiraTest for test case creation and defect tracking
- Experience in performance testing using Apache Bench, LoadRunner, and JMeter
- Experience in unit test and system test automation
- Strong experience in onshore/offshore development practices and team building
TECHNICAL SKILLS
Tools & Utilities: Informatica (all versions from v7 onward, including BDE), Cognos, Business Objects, WinSCP, Erwin 4.1, BMC Remedy, Compass, Quality Center, SpiraTest, Jira
Databases: Oracle, Teradata, MySQL, PostgreSQL, Greenplum
Platforms: Windows 2000/NT/XP/Vista/7, Linux/UNIX, Sun Solaris
Software: PL/SQL, BTEQ, Hadoop Hive, HBase, Shell Scripting, XML
Performance Tools: Apache Bench, LoadRunner, JMeter
PROFESSIONAL EXPERIENCE
Confidential, TAMPA, FL
Tech Lead (ETL/Informatica/Hadoop)
Responsibilities:
- Involved in requirements analysis, backlog creation, and sprint planning
- Analyzed business, functional, and technical requirements to ensure the project met expectations
- Produced low-level design and documentation
- Implemented requirements using Big Data Hadoop and UNIX scripts
- Scheduled all jobs using Oozie and queries using Hive
- Designed ETL specs and developed ETL mappings
- Performed unit testing, systems integration, and end-to-end validation
- Performed system and UAT testing
- Updated Mercury Quality Center with effective traceability
- Coordinated and communicated with the onsite/offshore teams
- Performed IQA and EQA
- Worked closely with end users to resolve issues
- Reviewed peers' code
- Coordinated with offshore team members (size 10) and delegated work
- Debugged and fixed issues
Environment: UNIX, Microsoft Windows 8.1, Big Data Hadoop, Informatica, Mercury Quality Center, PuTTY, Jira, LoadRunner, Apache Bench, JMeter, Flume
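The daily source-to-DWH reconciliation described above can be sketched as a simplified shell script. The file names and pipe-delimited format are illustrative stand-ins; in the real flow the two counts would come from Hive queries (e.g. `SELECT COUNT(*)` against source and DWH tables) scheduled by Oozie:

```shell
#!/bin/sh
# Simplified reconciliation sketch: compare row counts between a source
# extract and the warehouse load. Plain files stand in for Hive tables
# so the logic is self-contained; the comparison pattern is the same.

printf 'id|amount\n1|10\n2|20\n3|30\n' > source_extract.psv
printf 'id|amount\n1|10\n2|20\n3|30\n' > dwh_load.psv

# Subtract 1 from each line count to skip the header row.
src_count=$(($(wc -l < source_extract.psv) - 1))
dwh_count=$(($(wc -l < dwh_load.psv) - 1))

if [ "$src_count" -eq "$dwh_count" ]; then
    echo "RECON OK: $src_count rows in both source and DWH"
else
    echo "RECON MISMATCH: source=$src_count dwh=$dwh_count"
    exit 1
fi
```

A nightly Oozie coordinator would run the real Hive-backed version of this check and fail the workflow on a mismatch.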
Confidential, TAMPA, FL
Tech Lead
Responsibilities:
- Involved in requirements analysis and design documentation
- Implemented requirements using Postgres and UNIX scripts
- Designed ETL specs and developed ETL mappings
- Performed unit testing and updated Mercury Quality Center
- Performed systems integration and end-to-end validation
- Coordinated and communicated with the onsite/offshore teams
- Performed IQA and EQA
- Worked closely with end users to resolve issues
- Reviewed peers' code
- Coordinated with offshore team members (size 10) and delegated work
- Debugged and fixed issues
Environment: UNIX, Microsoft Windows 8.1, Informatica, PostgreSQL, Mercury Quality Center, PuTTY
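The Postgres-plus-UNIX-script pattern above can be sketched minimally as a shell wrapper that generates a day's load SQL. The table names and connection details are illustrative placeholders, and the `psql` submission is shown commented out so the sketch stays self-contained:

```shell
#!/bin/sh
# Sketch of a UNIX-script-driven Postgres load step. The schema, table,
# and host names are illustrative, not values from any real system.

run_date=$(date +%Y-%m-%d)

# Generate a rerunnable (delete-then-insert) load script for one day.
cat > daily_load.sql <<EOF
BEGIN;
DELETE FROM mart.daily_sales WHERE load_date = DATE '$run_date';
INSERT INTO mart.daily_sales (load_date, region, total)
SELECT DATE '$run_date', region, SUM(amount)
FROM staging.sales
WHERE sale_date = DATE '$run_date'
GROUP BY region;
COMMIT;
EOF

# In the real job this would run against the warehouse, e.g.:
# psql -h dwh-host -U etl_user -d dwh -v ON_ERROR_STOP=1 -f daily_load.sql
echo "Prepared daily_load.sql for $run_date"
```

Wrapping the delete and insert in one transaction keeps the load rerunnable without leaving partial data behind.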
Confidential
ETL Lead
Responsibilities:
- Analyzed business, functional, and technical requirements to ensure the project met expectations
- Performed requirements analysis and low-level design documentation
- Produced high-level design documentation
- Implemented requirements using Informatica and batch scripts
- Worked on data dictionary and business glossary documents
- Performed unit testing, systems integration, and end-to-end validation
- Worked on Control-M, scheduling all activities
- Coordinated and communicated with the onsite team
- Ensured the overall quality of the project and its deliverables
- Trained and mentored team members
- Informatica BDE (involved in initial analysis and POC)
Environment: Windows XP, SQL Server, Informatica 8, Autosys, Informatica BDE, Crystal Reports, SSIS, SSAS, SVN
Confidential
ETL Developer
Responsibilities:
- Designed business rules and functionality for validating and automating the reporting process
- Effectively estimated cost, schedule, and effort for the project
- Produced high-level design documentation
- Developed Cognos reports with drill-down capabilities, providing more flexible and readable information to business users
- Developed customer aggregated metrics computation using Informatica
- Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, and Normalizer
- Developed customer aggregated scorecard reports using Cognos
- Tested TL 9000 metric computation for all development and enhancements made to the system
- Performed unit testing and system testing
- Coordinated and communicated with the onsite team
- Responsible for transition and training of users
Environment: Windows XP and UNIX (Solaris 9), Informatica 7.1.4, COGNOS 8, Oracle PL/SQL, .NET 2003, UNIX Shell Scripting, SVN
Confidential
ETL Developer
Responsibilities:
- Implemented the ETL solution using Informatica
- Wrote BTEQ scripts, used FLOAD and MLOAD to load data into the warehouse, and used EXPLAIN plans to check query performance
- Analyzed source system data
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations
- Involved in importing source/target tables from the respective databases
- Used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, Router, and Normalizer
- Extensively used ETL to load data from Oracle and flat files into the data warehouse (Teradata)
- Used Informatica Workflow Manager to create sessions and workflows
- Tested Cognos cubes and reports against the backend data
- Performed unit testing and system testing
- Coordinated and communicated with the onsite team
- Responsible for transition and training of end users
Environment: Windows XP, Teradata BTEQ, UNIX Shell Scripting, Informatica 8.x, COGNOS 8, Cronacle, TortoiseCVS
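The BTEQ-based load step above can be sketched as a shell wrapper that generates the BTEQ script. The logon, table, and file names are illustrative placeholders, and the actual `bteq` submission is commented out so the sketch stays self-contained:

```shell
#!/bin/sh
# Sketch of a BTEQ import wrapper. Credentials, schema, and file names
# are illustrative placeholders, not values from any real system.

cat > load_orders.bteq <<'EOF'
.LOGON tdprod/etl_user,password;
.IMPORT VARTEXT '|' FILE = orders.psv;
.QUIET ON
.REPEAT *
USING (order_id VARCHAR(10), amount VARCHAR(12))
INSERT INTO stg.orders (order_id, amount)
VALUES (:order_id, :amount);
.LOGOFF;
.QUIT;
EOF

# In the real job this would be submitted to Teradata, e.g.:
# bteq < load_orders.bteq > load_orders.log 2>&1
echo "Generated load_orders.bteq"
```

For large volumes, FLOAD/MLOAD would replace the row-at-a-time `.REPEAT` insert; BTEQ imports like this one suit smaller reference loads.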