
Team Lead/ETL Developer Resume


SUMMARY:

  • Around 8 years of Business Discovery, Data Discovery & Analysis, Data Modeling, ETL Design, Development, Testing, Implementation, and Troubleshooting in the field of Data Warehousing and Application Development.
  • 8 years of strong Data Warehousing experience specializing in RDBMS, ETL concepts, strategy, design & architecting, Informatica administration, development, and testing.
  • An expert in the ETL tool Informatica, including components such as PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administration Console, PowerExchange CDC, and Informatica Intelligent Cloud Services.
  • Considerable knowledge of Informatica Intelligent Cloud Services, including components such as Data Integration, Administrator, Application Integration, and Monitor.
  • Demonstrated expertise in AWS technologies such as RDS, S3, EC2, Redshift, and EBS.
  • Persuasive communicator with exceptional relationship management skills and the ability to relate to people at any level of business and management.
  • Quick learner with excellent oral and written communication skills and strong interpersonal skills.
  • Performs well under pressure; expert at meeting targets; brings a can-do attitude; results-driven, good at multitasking, and highly customer-focused.
  • Good experience in Unix Shell Scripting and ETL Process Automation using shell programming and Informatica. Also experienced in performance tuning at both the database and Informatica levels.
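The ETL process automation mentioned above is typically driven by a wrapper script that invokes Informatica's `pmcmd` command-line utility to start a workflow and wait for its result. A minimal Python sketch of that pattern (the service, domain, folder, and workflow names below are hypothetical placeholders, not values from this resume):

```python
import subprocess

def build_pmcmd_cmd(service, domain, folder, workflow, user, pwd):
    """Assemble a pmcmd startworkflow command line.

    pmcmd is Informatica's command-line client; -sv, -d, -u, -p,
    -f, and -wait are its standard startworkflow options.
    """
    return [
        "pmcmd", "startworkflow",
        "-sv", service,         # Integration Service name
        "-d", domain,           # Informatica domain
        "-u", user, "-p", pwd,  # repository credentials
        "-f", folder,           # repository folder
        "-wait",                # block until the workflow finishes
        workflow,
    ]

def run_workflow(cmd):
    """Run the workflow and surface a non-zero pmcmd exit code."""
    result = subprocess.run(cmd)
    if result.returncode != 0:
        raise RuntimeError(f"workflow failed with code {result.returncode}")
```

Wrapping `pmcmd` this way lets a scheduler (cron, Control-M, etc.) treat each Informatica workflow as an ordinary command with a meaningful exit status.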

TECHNICAL EXPERIENCE:

Data Warehousing Technologies: Extract Transform Load & Reporting

ETL Tools: Informatica Power Center 10.x/ 9.x/8.6, Informatica Intelligent Cloud Services, SQL Server Integration Services

Reporting Tools: Power BI, SQL Server Analysis Services, MS Excel

Cloud Platform: Azure, AWS- Redshift, RDS, EC2, S3

Scripting Language: Unix Shell Scripting, Python

Databases: Oracle, SQL Server, SQL Azure

Operating System: MS-DOS, Linux, UNIX, Windows

PROFESSIONAL EXPERIENCE:

Confidential

Team Lead/ETL Developer

Platform: Windows, Informatica 10.x, UNIX, IICS, SSIS, AWS- Redshift

Tools/Technologies: Informatica, SQL Server, Unix Scripting, AWS (EBS, S3, Amazon RDS)

Responsibilities:

  • Implemented the infrastructure necessary to collect and correlate data in support of other initiatives, including cold and hot wheel-bearing detector analytics.
  • Conducted research to collect and assemble data for databases; responsible for the design and development of relational databases for collecting data in the railroad domain.
  • Maintained data integrity during extraction, manipulation, processing, analysis, and storage. Built data inputs and designed data collection screens; managed database design and maintenance.
  • Set up data ingestion and event parsing from various source systems to create a Data Lake.
  • Served as the single point of contact for business requirement analysis and technical queries.
  • Deployed Informatica PowerCenter on AWS Red Hat Enterprise Linux (RHEL) and used Amazon EBS to provide local persistent storage for PowerCenter.
  • Performed migration activities using AWS services such as Amazon Redshift, Amazon Relational Database Service (Amazon RDS), and Amazon Simple Storage Service (Amazon S3).
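Migrations like the one above commonly stage extracted files in S3 before loading them into Redshift. A minimal sketch of that staging step, assuming a date-partitioned key layout of my own choosing (bucket, system, and table names are illustrative, not from this resume):

```python
from datetime import date

def partitioned_key(system, table, run_date):
    """Build a date-partitioned S3 key for an extracted table,
    e.g. landing/crm/orders/2024/01/15/orders.csv. The layout is
    an illustrative convention, not a fixed AWS requirement."""
    return f"landing/{system}/{table}/{run_date:%Y/%m/%d}/{table}.csv"

def upload_extract(local_path, bucket, key):
    """Push a local extract file to S3 ahead of a Redshift COPY."""
    # Imported here so the key-building helper above works even
    # without the AWS SDK installed.
    import boto3
    boto3.client("s3").upload_file(local_path, bucket, key)
```

Date-partitioned keys make it easy to load a single day with Redshift's `COPY ... FROM 's3://bucket/landing/crm/orders/2024/01/15/'` prefix form.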

Confidential

Team Lead, Data Analyst, ETL Developer

Responsibilities:

  • Served as the focal point for sound decisions related to data collection, data analysis, data security, methodologies, and designs.
  • Conducted research to collect and assemble data for databases; responsible for the design and development of relational databases for collecting data in the railroad domain.
  • Maintained data integrity during extraction, manipulation, processing, analysis, and storage. Built data inputs and designed data collection screens; managed database design and maintenance.
  • Discussed intelligence and information requirements with internal and external personnel.

Confidential

Module Lead, Big Data Hadoop, ETL Developer

Platform: Windows, Informatica 9.6 +, UNIX, Oracle 11g

Tools/Technologies: Big Data Hadoop, Hive, Sqoop, Informatica, Oracle, Unix Scripting

Responsibilities:

  • Analyzed different source systems and designed a suitable archival methodology for each based on its source structure.
  • Completely owned 3 source systems and archived their data into the Hadoop platform without any defects. Performed tuning and enhancements to improve load times and performance.
  • Automated manual processes to save time and effort.
  • Created an audit utility in Java that saved AAA significant time and money: accessing Informatica's metadata for auditing would have cost millions, so the utility instead fetches each job's log file and extracts all the details required for the audit process.
  • Understood downstream requirements for multiple source systems fed from the Policy data mart; worked on fixing historical data and on performance enhancements to run the daily batch in production.
  • Worked closely with the client and onsite teams on requirement gathering, scope, and exit criteria for the module.
  • Took responsibility for data quality and data loads, especially in sensitive environments such as SIT and PERF.

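The log-scraping approach behind the audit utility described above (the original was written in Java) can be sketched briefly. This Python version assumes a hypothetical session-log format with "Rows Read"/"Rows Written" lines purely for illustration; real Informatica log layouts vary by version:

```python
import re

# Hypothetical session-log patterns; illustrative only.
ROWS_READ = re.compile(r"Rows Read:\s*(\d+)")
ROWS_WRITTEN = re.compile(r"Rows Written:\s*(\d+)")

def audit_from_log(log_text):
    """Extract row counts for an audit record from a job's log file,
    avoiding any need to query the repository metadata directly."""
    read = ROWS_READ.search(log_text)
    written = ROWS_WRITTEN.search(log_text)
    return {
        "rows_read": int(read.group(1)) if read else None,
        "rows_written": int(written.group(1)) if written else None,
    }
```

The returned counts can then be compared (e.g. read vs. written, or source vs. target) to flag loads that dropped or rejected rows.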
Confidential

ETL Developer

Responsibilities:

  • Gathered requirements for the enhancement to be implemented in the RT-CXO module, extending the code our team wrote for Home Owners in the previous module.
  • Worked closely with the client and onsite teams on requirement gathering, scope, and exit criteria for the module.
  • Took responsibility for data quality and data loads, especially in sensitive environments such as SIT and PERF.
  • Worked with two teams outside the organization that were also involved in the process, completing tasks within the given timelines.
