ETL Developer Resume
SUMMARY:
- Ambitious, resourceful, and dynamic Big Data / data warehouse consultant with 10+ years of professional experience and a strong drive for excellence in meeting organizational goals and objectives. Worked in large corporations throughout my career, usually in a consultant role. Excellent at juggling multiple tasks and working under pressure.
- 1 year of experience in data modeling.
- 3 years of experience developing ETL in Azure Data Factory.
- Expertise in the ETL tool Informatica PowerCenter 7.1/8.x/9.6/10.1.
- Expertise in Informatica Big Data Management (Informatica BDM).
- Expertise in Hadoop and Spark, including data visualization.
- Good knowledge of data warehouse concepts.
- Rich experience in ETL design and implementation.
- Excellent communication, analytical, and interpersonal skills.
- Knowledge of different schemas (Star and Snowflake) to fit reporting, query, and business analysis requirements.
- Expertise in designing and developing software applications.
- Working experience in data mapping, data analysis, and analysis of business requirements.
- Good knowledge of Informatica IDQ.
- Good knowledge of UNIX shell scripting.
- Good knowledge of big data processing using Vsql (Vertica).
- Good knowledge of conceptual, logical, and physical data modeling.
- Expertise in Oracle SQL, including complex query development and performance tuning.
- Good exposure to the Azure and AWS clouds and Informatica Cloud (IICS).
TECHNICAL SKILLS:
ETL: Informatica 7.x/8.x/9.x/10.x
OLAP: Business Objects 6.5, QlikView
Data Modeling: ER Studio, Erwin
Languages: Vsql, SQL, PL/SQL
Operating Systems: Windows 7, Windows 10, UNIX HP-UX
Cloud Technology: Informatica Cloud (IICS)
Databases: Oracle 12c, Teradata, HP Vertica, SQL Server 2012, Hive, HSQL, Redshift
Test Management Tool: Quality Center 9.2
Front End Tools: Toad, SQuirreL SQL
PROFESSIONAL EXPERIENCE:
Confidential
ETL Developer
Responsibilities:
- Load data from the golden source (AWS) to the landing area using Informatica BDM, for both native and Hadoop execution environments.
- Load data into staging (Hive), integrating all source systems from the Hadoop ecosystem through Informatica BDM jobs.
- Design the batch load process by creating UNIX scripts.
- Schedule jobs and perform the batch load process.
- Develop mappings to load data into fact and dimension tables, building the DW with various transformations and CDC techniques.
- Load all fact and dimension tables into Redshift from Oracle through Informatica BDM jobs.
- Create views for reporting purposes.
- Process live data via Kafka through Informatica BDM jobs into Oracle tables.
- Handle smooth implementation and unit testing of the application.
- Improve performance by tuning the SQL queries behind Informatica load jobs.
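The CDC technique referenced above can be sketched as a hash-compare between a source extract and the target snapshot. This is a minimal, generic Python illustration, not the actual Informatica BDM logic; column and key names are hypothetical:

```python
import hashlib

def row_hash(row, columns):
    """Hash the tracked columns of a row so changes can be detected cheaply."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode()).hexdigest()

def detect_changes(source_rows, target_rows, key, columns):
    """Classify source rows as inserts or updates against the target snapshot.

    source_rows / target_rows are lists of dicts; `key` is the business-key
    column. Returns (inserts, updates); unchanged rows are skipped.
    """
    target_index = {r[key]: row_hash(r, columns) for r in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        if row[key] not in target_index:
            inserts.append(row)
        elif target_index[row[key]] != row_hash(row, columns):
            updates.append(row)
    return inserts, updates
```

Only the inserts and updates then need to be applied to the fact and dimension tables, which keeps incremental batch windows small.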
Confidential
ETL Developer
Responsibilities:
- Load data from the golden source (AWS) to the landing area using MDM.
- Load data through Informatica IDQ where required by the business logic.
- Integrate raw data from the Hadoop system into Hive using Informatica BDM.
- Create source-to-target mappings, sessions, and workflows.
- Load data into S3 using Python scripts with CDC techniques.
- Create dashboards in SAS for report-card reporting.
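A Python-to-S3 load like the one above can be sketched as serializing the changed rows to CSV and handing them to a boto3 S3 client. This is an illustrative sketch, not the production script; the bucket, key, and column names are placeholders, and the client is injected so it can be stubbed in tests:

```python
import csv
import io

def rows_to_csv_bytes(rows, columns):
    """Serialize changed rows to CSV bytes for staging in S3."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

def load_to_s3(client, bucket, key, rows, columns):
    """Upload serialized rows; `client` is a boto3 S3 client (or a stub)."""
    body = rows_to_csv_bytes(rows, columns)
    client.put_object(Bucket=bucket, Key=key, Body=body)
    return len(body)
```

In a real job, `client` would come from `boto3.client("s3")` and the key would typically encode the load date for partitioned downstream reads.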
Confidential
ETL Developer
Responsibilities:
- Work as an ETL Developer using the Informatica ETL tool.
- Develop mappings, sessions, and workflows as per the ETL specification.
- Develop mappings to extract data from various sources, cleanse it, and load it into the staging area using various transformations.
- Develop mappings to load data into fact and dimension tables, building the DW with various transformations and CDC techniques.
- Create stored procedures and dynamic SQL.
- Create complex views for stakeholder reports.
- Create materialized views for complex update statements.
- Develop DQM reports for downstream customers.
- Develop the GDPR data anonymization method.
- Develop UNIX scripts for job automation.
- Move data between environments using Sqoop.
- Develop complex SQL and load files into HDFS using Hive.
- Use optimized queries for Hive.
- Work as a Vertica Developer using Vertica Vsql.
- Generate extracts from the Teradata legacy system and load them into the Vertica stage.
- Load fact and dimension tables from the Vertica stage into the EDW Vertica TCORE.
- Create metrics reports from TCORE to load into TCAP.
- Create UNIX scripts for generating extracts.
- Create Vsql scripts for loading tables into TCORE and TCAP.
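Job automation around Vsql scripts like those above typically wraps the `vsql` client and fails the batch on a non-zero exit code. A minimal sketch (the command name, `-f` script flag, and connection handling are assumptions; real jobs would also pass host/user flags or rely on environment variables):

```python
import subprocess

def run_vsql_script(script_path, vsql_cmd="vsql"):
    """Run a Vsql script file and fail the batch if the client exits non-zero.

    `vsql_cmd` is injectable so the wrapper can be exercised without a
    Vertica installation.
    """
    result = subprocess.run(
        [vsql_cmd, "-f", script_path],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        raise RuntimeError(f"vsql failed: {result.stderr.strip()}")
    return result.stdout
```

A scheduler (cron, Control-M, etc.) would call this per table-load script and alert on the raised error.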
Confidential
ETL Developer
Responsibilities:
- Work as an ETL Developer using the Informatica tool.
- Develop mappings, sessions, and workflows as per the ETL specification.
- Develop mappings to extract data from various sources, cleanse it, and load it into the staging area using various transformations.
- Develop mappings to load data into fact and dimension tables, building the DW with various transformations and CDC techniques.
- Extract data from SharePoint and load it into the Vertica database.
- Create mappings for the staging area with data cleansing.
- Load fact and dimension tables for report generation.
- Analyze and correct defects to improve data quality, and validate the data.
- Work as a data quality checker.
- Use XML sources and load the data.
- Perform testing from the Dev repository to the QA repository.
- Create workflows for batch loading using various tasks.
- Load data from the presentation layer into dimension and fact tables.
- Perform direct loads with Vsql (file to Vertica direct load).
- Create projections, segmentation, analytics, and hints to improve performance at the database level.
- Load data from flat files into the Vertica database using Vsql scripting.
- Conduct peer reviews.
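A flat-file direct load into Vertica like the one above usually comes down to generating a COPY statement for Vsql. The sketch below assembles one in Python; the table and file names are placeholders, and the option ordering follows Vertica's COPY syntax (DELIMITER, optional REJECTED DATA, DIRECT for a direct-to-ROS load, ABORT ON ERROR):

```python
def build_vsql_copy(table, data_file, delimiter="|",
                    direct=True, rejected_file=None):
    """Assemble a Vertica COPY statement for a flat-file load via Vsql."""
    parts = [
        f"COPY {table} FROM LOCAL '{data_file}'",
        f"DELIMITER '{delimiter}'",
    ]
    if rejected_file:
        # Rejected rows are written to a side file for later analysis.
        parts.append(f"REJECTED DATA '{rejected_file}'")
    if direct:
        parts.append("DIRECT")
    parts.append("ABORT ON ERROR;")
    return " ".join(parts)
```

The generated statement can then be written into a `.sql` script and executed through the `vsql` client as part of the batch.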
Confidential
ETL Developer
Responsibilities:
- Develop mappings, sessions, and workflows as per the ETL specification.
- Develop mappings to extract data from various sources and load it into the staging area using various transformations.
- Develop mappings to load data into the DW using various transformations and CDC techniques.
- Analyze and correct defects to improve data quality.
- Perform testing from the Dev repository to the QA repository.
- Create workflows for batch loading using various tasks.
- Load data from the presentation layer into dimension and fact tables.
- Conduct peer reviews.
- Load fact and dimension tables, extracting from different sources.
Confidential
ETL Developer
Responsibilities:
- Develop mappings, sessions, and workflows as per the ETL specification.
- Develop mappings to load data into the DW using various transformations and CDC techniques.
- Load fact and dimension tables using ETL mappings as per OLAP requirements.
- Migrate folders from one repository to another.
- Perform testing from the Dev repository to the QA repository.
- Implement materialized views in the Dev database.
- Migrate the database from SQL Server to Oracle.
- Conduct peer reviews.
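A SQL Server to Oracle migration like the one above involves, among other things, translating column data types when regenerating DDL. The sketch below shows an illustrative subset of common mappings (INT to NUMBER(10), VARCHAR to VARCHAR2, TEXT to CLOB, and so on); the column names are hypothetical, and a real migration would extend the map per the actual schema:

```python
# Illustrative subset of SQL Server -> Oracle type mappings; a real
# migration would cover the full schema and precision/scale rules.
TYPE_MAP = {
    "INT": "NUMBER(10)",
    "BIGINT": "NUMBER(19)",
    "BIT": "NUMBER(1)",
    "DATETIME": "DATE",
    "NVARCHAR": "NVARCHAR2",
    "VARCHAR": "VARCHAR2",
    "TEXT": "CLOB",
}

def translate_column(name, sqlserver_type, length=None):
    """Return an Oracle column definition for a SQL Server column."""
    oracle = TYPE_MAP.get(sqlserver_type.upper(), sqlserver_type.upper())
    if length and oracle in ("VARCHAR2", "NVARCHAR2"):
        oracle = f"{oracle}({length})"
    return f"{name} {oracle}"
```

Running each source column definition through such a map yields the CREATE TABLE bodies for the Oracle side, which can then be reviewed against tool-generated DDL.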
