Team Lead Resume
SUMMARY:
- Experienced IT professional with 12+ years of experience as an architect and developer in data management, ETL, data warehousing and the big data ecosystem, working in the Banking, Health Care, Insurance, Retail and Telecom domains.
- Experienced in Informatica PowerCenter 6.1, 7.x, 8.x and 9.x (Informatica Designer, Workflow Manager and Workflow Monitor), Informatica BDM, Informatica PowerExchange, IDQ, PDI (Pentaho Data Integration), SQL Server 2008, Oracle, Netezza 7.x and Teradata BTEQ scripts, and well versed with big data and MPP platforms such as Teradata, Netezza, Hadoop, Hive, Hue and Sqoop; applications on Windows and UNIX/Linux environments.
- Experienced in all phases of the SDLC (requirements gathering, analysis, technical design, modeling and end-to-end implementation) of data quality, data mart and data warehousing solutions using star schema, snowflake and Data Vault models in Waterfall and Agile environments, with good knowledge of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.
- Hands-on experience with data extraction, transformation and loading, and with preparation of technical specifications and solution blueprints.
- Led test strategy and planning, performed various levels of testing such as unit, system, functional and pre-production ETL testing, and prepared unit test specification requirements.
- Strong knowledge of Hadoop architecture and daemons such as HDFS, Job Tracker, Task Tracker, Name Node and Data Node, and of MapReduce concepts.
- Implemented ad-hoc queries using Hive to perform analytics on structured data.
- Expertise in writing Hive UDFs and generic UDFs to incorporate complex business logic into Hive queries.
- Experienced in optimizing Hive queries by tuning configuration parameters.
- Involved in designing the data model in Hive for migrating the ETL process into Hadoop, and wrote Pig scripts to load data into the Hadoop environment.
- Implemented Sqoop for large dataset transfers between Hadoop and RDBMS (a representative transfer is sketched after this summary).
- Worked with Oozie and ZooKeeper to manage the flow of jobs and coordination in the cluster.
- Good interpersonal skills; committed, result-oriented and hardworking, with a zeal to learn new technologies. Mentored, coached and cross-trained junior developers by providing domain knowledge and design advice.
- Proficient problem solver who considers both business and technical perspectives to develop workable solutions. Motivated achiever who guides organizations in applying technology to business settings, provides added value and creates project deliverables in a timely manner.
- Outstanding project and team leader; able to coordinate and direct all phases of project-based efforts while managing, motivating, and guiding teams.
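The following is a minimal, illustrative sketch of the kind of Sqoop transfer and parameter-tuned Hive query referred to above; the connection string, schema and table names are placeholders rather than details from any actual engagement.

#!/bin/sh
# Illustrative only: host, credentials and table names are placeholders.

# Pull a large RDBMS table into HDFS in parallel with Sqoop.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table SALES.TRANSACTIONS \
  --target-dir /data/staging/transactions \
  --num-mappers 8 \
  --fields-terminated-by '\t'

# Run an ad-hoc aggregation in Hive with a couple of tuning parameters set.
hive -e "
  SET hive.exec.parallel=true;
  SET hive.exec.reducers.bytes.per.reducer=256000000;
  SELECT region, COUNT(*) AS txn_count
  FROM staging.transactions
  GROUP BY region;
"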
TECHNICAL SKILLS
- ETL Tools: Informatica 6.1, 7.x, 8.x, 9.x, NetProphet, PDI (Pentaho Data Integration)
- Database: Oracle
- Big data: Teradata, Netezza, HDFS, Hadoop, Hive, Hue, Sqoop
- Scheduling Tools: Redwood Cronacle, Control-M
- Programming Languages: C, C++, Shell scripting
- Operating Systems: UNIX, Windows 7 Platforms
PROFESSIONAL EXPERIENCE
Team Lead
Confidential
Environment: HDFS, Hive, Hue, Hadoop, Sqoop, Hortonworks Data Platform, Informatica, Jira, UNIX, TWS
Responsibilities:
- Responsible for providing technical advice and guidance to the management
- Assisted PM in estimating project cost and benefits.
- Defined infrastructure requirements for the project.
- Gathered requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for project data classification, code reviews, monitoring the Storm topology, effort estimations and delivery.
- Mentored developers for design, implementation, and difficult tasks.
- Created source to target mapping and job design documents from staging area to Data Warehouse.
- Worked on troubleshooting, performance tuning, performance monitoring and enhancement of mappings.
- Integrated Hadoop into traditional ETL, accelerating the extraction, transformation, and loading of massive structured and unstructured data.
- Wrote SQL queries required to retrieve data for extracts.
- Used Hive join queries to join multiple tables of a source system and load the results into Elasticsearch tables.
- Worked extensively on Hive and Sqoop.
- Wrote Hive queries for data analysis to meet business requirements.
- Created Hive tables and worked on them using HiveQL; imported and exported data between the Oracle database and HDFS using Sqoop (see the sketch after this list).
- Identified the cause and resolution of defects found during unit/system testing.
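A brief, hedged sketch of the kind of Hive table creation and Sqoop export described in the bullets above; the database connection, directories and table names are illustrative placeholders.

#!/bin/sh
# Illustrative only: connection details, directories and table names are placeholders.

# Define an external Hive table over data already landed in HDFS.
hive -e "
  CREATE EXTERNAL TABLE IF NOT EXISTS stg.customer (
    customer_id BIGINT,
    name        STRING,
    region      STRING
  )
  ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
  LOCATION '/data/staging/customer';
"

# Push processed results from HDFS back to the relational database.
sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user -P \
  --table CRM.CUSTOMER_SUMMARY \
  --export-dir /data/processed/customer_summary \
  --input-fields-terminated-by '\t'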
Team Lead
Confidential, Bellevue, USA
Environment: Windows 7, Netezza 7.2.0.7, SQL Server 2008, NetProphet (ETL tool), Control-M, SVN, Jira, UNIX, HDFS, Hive, Hue, Hadoop, Sqoop, Hortonworks Data Platform, Pentaho
Responsibilities:
- The engineering team wants to analyze traffic patterns on TMUS's voice, 4G and LTE data, text and multimedia networks for performance and capacity planning.
- Marketing teams want to understand the device types on the network for rolling out new campaigns.
- Law enforcement may request Confidential to analyze a particular subscriber's usage details.
- Responsible for providing technical advice and guidance to the management
- Assisted PM in estimating project cost and benefits.
- Defined infrastructure requirements for the project.
- Gathered requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for project data classification, code reviews, monitoring the Storm topology, effort estimations and delivery.
- Mentored developers for design, implementation, and difficult tasks.
- Created source to target mapping and job design documents from the staging area to Data Warehouse.
- Worked on troubleshooting, performance tuning, performance monitoring and enhancement of mappings.
- Performed SQL tuning and Application tuning using EXPLAIN PLAN.
- Wrote SQL queries required to retrieve data for extracts.
- Created database objects such as tables, views and materialized views.
- Used Hive join queries to join multiple tables of a source system and load the results into Elasticsearch tables.
- Worked extensively on Hive and Sqoop.
- Wrote Hive queries for data analysis to meet business requirements.
- Created Hive tables and worked on them using HiveQL; imported and exported data between the Oracle database and HDFS using Sqoop.
- Identified the cause and resolution of defects found during unit/system testing.
- Scheduled workflows using the Control-M scheduler (a representative job wrapper is sketched below).
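A minimal sketch of the kind of shell wrapper scheduled through Control-M to run a Hive join load, assuming the scheduler treats a non-zero exit code as a failure; the schemas, tables and columns are illustrative placeholders.

#!/bin/sh
# Illustrative Control-M job wrapper; schemas, tables and columns are placeholders.

hive -e "
  INSERT OVERWRITE TABLE es_stage.device_usage
  SELECT u.subscriber_id, d.device_type, SUM(u.bytes_used) AS total_bytes
  FROM usage_db.cdr u
  JOIN ref.device d ON u.device_id = d.device_id
  GROUP BY u.subscriber_id, d.device_type;
"
rc=$?

# Propagate the Hive return code so Control-M can flag failures and retry.
if [ $rc -ne 0 ]; then
  echo "Hive load failed with return code $rc" >&2
  exit $rc
fi
echo "Hive load completed successfully"
exit 0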
Senior System Engineer
Confidential, Retail Stockholm, Sweden
Environment: Windows 7, Teradata, Informatica Power Center 9.1, Jira
Responsibilities:
- Responsible for providing technical advice and guidance to the management
- Assisted PM in estimating project cost and benefits.
- Defined infrastructure requirements for the project.
- Gathered new requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for Project Data classification, Code Reviews, monitoring the Informatica mappings, effort estimations and delivery
- Developed and designed jobs using Informatica Designer based on mapping specifications using appropriate stages.
- Performed SQL tuning and Application tuning using EXPLAIN PLAN
- Used the Teradata utilities FastLoad and MultiLoad to load data.
- Wrote BTEQ scripts to transform data (a minimal example follows this list).
- Wrote SQL queries required to retrieve data for extracts.
- Created database objects such as tables, views and materialized views.
- Created source to target mapping and job design documents from staging area to Data Warehouse.
- Data Vault Model was used in developing Informatica jobs.
- Identified the cause and resolution of defects found during unit/system testing.
- Scheduled workflows using Workflow Manager.
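A minimal BTEQ example of the kind referred to above, wrapped in a shell heredoc; the Teradata system name, the credentials (read from the environment) and the table names are illustrative placeholders.

#!/bin/sh
# Illustrative only: tdprod, etl_user and the table names are placeholders;
# TD_PASSWORD is expected to be set in the environment.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PASSWORD};

-- Transform staged rows into the target fact table.
INSERT INTO dw.sales_fact (sale_id, store_id, sale_date, amount)
SELECT s.sale_id, s.store_id, CAST(s.sale_ts AS DATE), s.amount
FROM   stg.sales s
WHERE  s.load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.QUIT 0;
EOF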
Senior System Engineer
Confidential
Environment: Linux, Windows NT, Oracle 11g, Informatica PowerCenter 9.1. GSM (Global Sapphire Mart) is a central database which holds data related to trade settlements, front-office applications (i.e. order entry), order execution, and asset and price feeds, and provides feeds to other applications for regulatory and compliance reporting.
Responsibilities:
- Grasped the functional requirements and understood the detailed high-level design document.
- Gathered new requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for Project Data classification, Code Reviews, monitoring the Informatica mappings, effort estimations and delivery
- Imported the table definitions into the repository, then exported the projects and released and packaged the jobs.
- Developed ETL processes to extract the source data and load it into the enterprise-wide data warehouse after cleansing, transforming and integrating.
- Developed and designed jobs using Informatica Designer based on mapping specifications using appropriate stages.
- Created source to target mapping and job design documents from staging area to Data Warehouse.
- Developed job sequences to execute a set of jobs with restart ability, check points and implemented proper failure actions.
- Identified the cause and resolution of defects found during unit/system testing.
- Worked on troubleshooting, performance tuning, performance monitoring and enhancement of Informatica jobs.
- Streamlined the process and prepared the target load order, which gives the order in which tables are loaded and their dependencies in the Oracle database (a dependency query of the kind used is sketched below).
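A small sketch of the kind of dictionary query used to derive such a target load order, run here through sqlplus; the connect string is a placeholder, while USER_CONSTRAINTS is the standard Oracle data dictionary view.

#!/bin/sh
# Illustrative only: the connect string is a placeholder; ORA_PASSWORD is read from the environment.

sqlplus -s "etl_user/${ORA_PASSWORD}@ORCL" <<EOF
-- List child -> parent table dependencies from foreign keys;
-- parent tables must be loaded before their children.
SET PAGESIZE 200 LINESIZE 120
SELECT fk.table_name AS child_table,
       pk.table_name AS parent_table
FROM   user_constraints fk
JOIN   user_constraints pk
       ON fk.r_constraint_name = pk.constraint_name
WHERE  fk.constraint_type = 'R'
ORDER  BY parent_table, child_table;
EXIT
EOF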
Senior System Engineer
Confidential
Environment: Linux, Windows NT, Oracle 11g, Informatica Power Center 9.1, Informatica Data Replication (IDR), Informatica Data Quality (IDQ), Informatica Data Explorer (IDE)
Responsibilities:
- Grasped the functional requirements and understood the detailed high-level design document.
- Gathered new requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for Project Data classification, Code Reviews, monitoring the Informatica mappings, effort estimations and delivery
- Imported the table definitions into the repository, then exported the projects and released and packaged the jobs.
- Developed ETL processes to extract the source data and load it into the enterprise-wide data warehouse after cleansing, transforming and integrating.
- Developed and designed jobs using Informatica Designer based on mapping specifications using appropriate stages.
- Created source to target mapping and job design documents from staging area to Data Warehouse.
- Developed job sequences to execute a set of jobs with restart ability, check points and implemented proper failure actions.
Senior System Engineer
Confidential
Environment: UNIX, Windows NT, DB2, Informatica 9.1
Responsibilities:
- Implemented business plan based on required technical and business needs.
- Imported the table definitions into the repository, then exported the projects and released and packaged the jobs.
- Developed ETL processes to extract the source data and load it into the enterprise-wide data warehouse after cleansing, transforming and integrating.
- Developed and designed jobs using Informatica Designer based on mapping specifications using appropriate stages.
- Star schema and snowflake schema models were used in developing Informatica jobs.
- Created source to target mapping and job design documents from staging area to Data Warehouse.
- Developed job sequences to execute a set of jobs with restart ability, check points and implemented proper failure actions.
- Worked on troubleshooting, performance tuning, performance monitoring and enhancement of Informatica jobs.
Senior System Engineer
Confidential
Environment: UNIX, Windows NT, Netezza, Informatica 9.1
Responsibilities:
- Grasped the functional requirements and understood the detailed high-level design document.
- Gathered new requirements and wrote the design specifications.
- Organized team meetings and prepared weekly/monthly status reports.
- Responsible for Project Data classification, Code Reviews, monitoring the Informatica mappings, effort estimations and delivery
- Responsible for Extraction, Transformation and Loading of the data using Informatica mappings, ETL Specification reviews and recommended changes, preparing unit test cases
- Identified the cause and resolution of defects found during unit/system testing.
- Streamlined the process and prepared the target load order, which gives the order in which tables are loaded and their dependencies in the Netezza database.
Lead Developer
Confidential
Responsibilities:
- Analyzed the requirements of the ETL logic using ETL specifications.
- Defined source-to-target mappings for the transformations.
- Used the Source Analyzer and Target Designer to import source/target tables from the respective databases.
- Translated business processes into Informatica mappings.
- Loaded the staging tables and EPDS tables as per the business requirements.
