
ETL Developer Resume


  • Dynamic professional with 8+ years of experience in providing Business Intelligence solutions in Data Warehousing for Decision Support Systems.
  • Experience in the analysis, design, support, and development of data warehousing solutions and in developing strategies for Extraction, Transformation and Loading (ETL) using ETL tools along with Big Data Hadoop.
  • Knowledge of full life cycle development for building a data warehouse.
  • Excellent programming skills with the ability to automate routine tasks using shell scripting and the Autosys/Maestro/Informatica/Crontab schedulers and Big Data tools, and good experience in Confidential SQL and Confidential.
  • Worked for about 3 years on most components of Informatica PowerCenter, supporting, creating, executing, testing, and maintaining mappings; also experienced with the Ab Initio Confidential >Operating System for application tuning and debugging strategies.
  • Experience in integrating various data sources with multiple relational databases such as Confidential, including data from flat files and Confidential tables.
  • Exposure to Multifile systems.
  • Involved in performance tuning of SQL queries by generating explain plans and checking Viewpoint.
  • Knowledge of other ETL tools such as Informatica, Ab Initio, and DataStage, and of reporting tools such as Cognos.
  • Highly motivated, employee-focused professional with extensive experience in coding, deployment, monitoring, audits, and documentation.
  • Highly creative and self-motivated, with innovative and effective ideas and concepts for improving efficiency.
  • Comfortable interacting with people from diverse cultures across the globe.
  • Energetic professional known for the ability to envision and create successful outcomes in complex, multicultural environments.



ETL Developer

Responsibilities:

  • Designed ETL applications and developed data warehouse applications based on the technical/functional specifications.
  • Participated in meetings to gather information and requirements from ad hoc business users.
  • Prepared the detailed design document for all the modules required for development.
  • Designed and developed ETL jobs that extract information from Confidential tables and flat files and load it into a Confidential data warehouse using Informatica and the Big Data tools Sqoop and Hive.
  • Coordinated development work with team members, reviewed ETL jobs, and created scripts for job scheduling and implementation.
  • Created proper test data to satisfy all required test cases and performed unit and system integration testing on all deliverables.
  • Developed data transformation, loading, scrubbing, and extraction programs using the Ab Initio ETL tool.
  • Designed and developed graphs using the GDE with components such as Partition by Round-robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, Gather, Normalize, and Concatenate.
  • Also used components such as Filter by Expression, Partition by Expression, Replicate, Partition by Key, and Sort.
  • Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles as needed.
  • Created summary tables using Rollup, Scan, and Aggregate.
  • Implemented a phasing and checkpoint approach in the ETL process to prevent data loss and maintain uninterrupted data flow across process failures.
  • Enhanced ETL performance using data parallelism (multifiles), component parallelism, pipeline parallelism, and in-memory sorting.
  • Implemented partitioning techniques using the multifile system.
  • Wrote shell scripts to create processes involving multiple graphs and to call .ksh scripts, SQL queries, and UNIX commands. The graphs were fully parameterized, and the parameters were passed to the graphs as environment variables from wrapper scripts.
  • Redesigned existing graphs and documented all new and enhancement requests.
  • Analyzed issues with unmatched records and provided code fixes.
  • Deployed and executed Ab Initio jobs in a UNIX environment.
  • Good knowledge of UNIX commands, SQL queries, Confidential queries, etc.
  • Provided production L3 support, resolving issues such as missing files, storage, loading, and logic problems.
  • As onsite team lead, represented the team in meeting Service Level Agreements.
  • Worked with other ETL tools such as DataStage, the reporting tool Cognos, and the support tool Remedy for ticket logging and tracking.
  • Currently involved in job scheduling using the Autosys scheduler and in performance tuning of SQL queries via explain plans.
  • Intermediate-level knowledge of UNIX shell scripting, HTML, and Cognos.
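The wrapper-script pattern described above (fully parameterized graphs receiving their parameters as environment variables) can be sketched roughly as follows. This is a minimal illustration only: the names RUN_DATE, TGT_TABLE, and load_stage.ksh are hypothetical, and the actual graph invocation is replaced by a placeholder echo.

```shell
#!/bin/sh
# Hypothetical wrapper sketch: parameters are exported so the deployed,
# parameterized graph can read them as environment variables.

run_graph() {
    export RUN_DATE="$1"      # e.g. 20240101 (illustrative parameter)
    export TGT_TABLE="$2"     # staging table the graph would load

    # In a real wrapper the deployed graph's .ksh would be invoked here;
    # this sketch only echoes the intended call.
    echo "running load_stage.ksh for ${TGT_TABLE} on ${RUN_DATE}"
}

run_graph 20240101 STG_CUSTOMER
```

In this scheme the scheduler (e.g. Autosys) calls only the wrapper, which centralizes parameter handling, logging, and return-code checks outside the graph itself.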


ETL Tools: Informatica PowerCenter, DataStage, Ab Initio GDE, Ab Initio Confidential >Operating System

Databases: Confidential 11g, Confidential, MS SQL

Programming Languages: C, C++, HTML

Defect Tracking: SharePoint logging

Reporting: Cognos 10.1

Operating Systems: Windows NT/2000/XP/98/7, UNIX.

Email: Confidential Lotus Notes and Confidential Outlook

Other Tools: MS Office (Excel, Word, PowerPoint), Big Data Hive and Sqoop
