
Senior ETL Developer/Data Engineer Resume


SUMMARY

  • Over 12.5 years of information technology experience in Data Architecture, ETL Architecture and Development, Business Analysis, Data Modeling, and Reporting for the Enterprise Data Warehouse.
  • 3+ years of experience in Azure SQL Database, Azure Data Lake, Pipelines, Azure Data Factory, Azure Databricks, Azure Cosmos DB, and Azure Data Warehouse.
  • Extensive experience in Relational and Dimensional Data Modeling, creating Logical and Physical database designs and ER Diagrams using the Erwin data modeling tool.
  • Involved in defining and delivering data quality and governance strategy, and in implementing data quality standardization, security and consistency.
  • Experience in Data Profiling, Data Analysis, Data Migration, Data Validation, Data Cleansing, Data Verification and identifying Data Mismatch.
  • Used Informatica Web Services for address validations and customer validations.
  • Identified and eliminated duplicates in datasets through the IDQ 9.1.0 Edit Distance and Mixed Field Matcher components (a SQL analogue is sketched after this list). This enables a single view of the customer and helps control mailing-list costs by preventing duplicate mail pieces.
  • Strong experience as a Technical Lead guiding ETL development and support teams, ensuring that they understand and fully comply with data quality standards, architectural guidelines and designs.
  • 4.7 years of US experience gathering requirements and developing feasible solutions by working directly with clients.
  • Knowledge of the Hadoop ecosystem and its core frameworks, including HDFS, YARN, MapReduce, Pig, Hive, Flume, Sqoop, Oozie, Impala, Zookeeper.
  • Developed methodologies for cloud migration, implemented best practices, and helped develop backup and recovery techniques for applications and databases on virtualization platforms.
  • Well experienced in defining, designing, integrating and re-engineering the Enterprise Data Warehouse and Data Marts in different environments such as Teradata and Oracle, with various levels of complexity.
  • Knowledge of the complete development life cycle of the Data Warehousing ETL process.
  • Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.
  • Very strong knowledge of relational databases (RDBMS), data modeling, and building Data Warehouses and Data Marts using Star Schema, Snowflake Schema and physical data modeling with Teradata and Oracle.
  • Good business understanding of the Telecom and Insurance domains.
  • Extensive experience in writing functional specifications, translating business requirements into technical specifications, and creating/maintaining/modifying database design documents with detailed descriptions of logical entities and physical tables.
  • Strong documentation and knowledge-sharing skills; conducted data modeling review sessions for user groups such as developers, business analysts and database administrators, and participated in requirements sessions to assess feasibility.
  • Exceptional communication and presentation skills and an established track record of client interactions.
  • Ability to meet deadlines and handle multiple tasks, flexible in work schedules.
  • Expertise in creating Mappings, Trust and Validation rules, Match Path, Match Column, and Match rules.
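
A minimal SQL analogue of the duplicate-elimination approach mentioned above. The actual work used Informatica IDQ matcher components; the table and column names below (customers, full_name, postal_code) are illustrative only, and Oracle's UTL_MATCH package stands in for the IDQ Edit Distance matcher.

    -- Flag likely duplicate customer records using edit-distance similarity.
    -- Blocking on postal_code limits the number of pairwise comparisons.
    SELECT a.customer_id AS survivor_id,
           b.customer_id AS duplicate_id,
           UTL_MATCH.EDIT_DISTANCE_SIMILARITY(a.full_name, b.full_name) AS name_score
      FROM customers a
      JOIN customers b
        ON a.customer_id < b.customer_id
       AND a.postal_code = b.postal_code
     WHERE UTL_MATCH.EDIT_DISTANCE_SIMILARITY(a.full_name, b.full_name) >= 90;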

TECHNICAL SKILLS

Big Data and Cloud Technologies: Microsoft Azure, MapReduce, Pig, Hive, Flume, Sqoop, Oozie, Impala, HBase, Informatica Cloud (IICS)

ETL and Reporting Tools: Informatica Power Center 10.6.1, Informatica Power Exchange 10.6 (CDC), Business Objects 4.0

Data Modeling Tools: Erwin r7.1/7.2, ER Studio V8.0.1 and Oracle Designer

Scheduling Tools: Control-M, Autosys, Maestro, One Automation

RDBMS and Tools: Teradata, Oracle 9i/10g, SQL Server 2012, TOAD, PL/SQL Developer, SQL Developer, SQL*Loader, SQL Server Management Studio

Programming Languages: PL/SQL, UNIX Shell Scripting

Operating System: HP UNIX, Linux, Windows XP, Windows 2003 Server

Other Tools: WinSCP, PuTTY, TortoiseSVN, ServiceNow

Documentation: MS Office, Open Office

PROFESSIONAL EXPERIENCE

Confidential

Senior ETL Developer/Data Engineer

Responsibilities:

  • Studied in-house requirements for the Data warehouse to be developed.
  • Conducted one-on-one sessions with business users to gather data warehouse requirements.
  • Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions.
  • Developed a Conceptual model using Erwin based on requirements analysis.
  • Developed normalized Logical and Physical database models to design OLTP system for Field Operations applications.
  • Created a dimensional model for the reporting system by identifying required dimensions and facts using Erwin r7.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
  • Worked with Database Administrators, Business Analysts and Content Developers to conduct design reviews and validate the developed models.
  • Identified, formulated and documented detailed business rules and Use Cases based on requirements analysis.
  • Facilitated development, testing and maintenance of quality guidelines and procedures along with necessary documentation.
  • Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
  • Responsible for defining the naming standards for data warehouse.
  • Exhaustively collected business and technical metadata and maintained naming standards.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
  • Data manipulation with SQL, HiveQL, Spark.
  • Administration and monitoring of Big Data environment (Linux, Ambari, HDInsight, ADLS)
  • Implemented the Data Mart in Hive and SQL (see the HiveQL sketch after this list).
  • Analyzed, strategized and implemented the Azure migration of databases to the cloud.
  • Integration between Big Data and traditional BI environments.
  • Extracted data from databases (Oracle, SQL Server, DB2) and flat files using Informatica to load it into a single data warehouse repository.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing.
  • Integrated the work tasks with relevant teams for smooth transition from testing to implementation.
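
A brief HiveQL sketch of the kind of data mart load described above. The database, table and column names (sales_mart.daily_sales, staging.sales_raw) and the run_date variable are hypothetical placeholders, not the actual project objects.

    -- Hypothetical Hive data mart table, partitioned by load date and stored as ORC.
    CREATE TABLE IF NOT EXISTS sales_mart.daily_sales (
      product_id  STRING,
      region      STRING,
      total_qty   BIGINT,
      total_amt   DECIMAL(18,2)
    )
    PARTITIONED BY (load_date STRING)
    STORED AS ORC;

    -- Incremental load of one partition from a staging table.
    INSERT OVERWRITE TABLE sales_mart.daily_sales
    PARTITION (load_date = '${hiveconf:run_date}')
    SELECT product_id,
           region,
           SUM(qty)    AS total_qty,
           SUM(amount) AS total_amt
    FROM   staging.sales_raw
    WHERE  load_date = '${hiveconf:run_date}'
    GROUP BY product_id, region;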

Confidential, Houston, TX

Senior ETL Lead Developer/Data Engineer

Responsibilities:

  • Conducting requirement gathering sessions, feasibility studies with business analysts.
  • Worked on data cleansing using the cleanse functions in Informatica IDQ.
  • Used Informatica ETL Power Center 9.6 for migrating data from various OLTP databases to the data mart.
  • Worked with different sources like Oracle, flat files, XML files, DB2, MS SQL Server
  • Extracted data from the Sales department to flat files and loaded the data into the target database.
  • Developed complex mappings using Informatica to load Dimension and Fact tables per star schema techniques (see the SQL sketch after this list).
  • Extracted data from sources such as fixed-width and delimited flat files, transformed the data according to the business requirements, and loaded it into the target Data Mart.
  • Created tasks in the Workflow Manager, exported IDQ mappings and executed them using the scheduling tools.
  • Created the IDD application with Subject Areas and Subject Area Groups; deployed and tested the IDD application and cleanse functions, utilized the timeline, and exported/imported master data from flat files.
  • Involved in preparing required Technical Specification Documents from the Business requirements following Organization standard.
  • Constructed reusable objects such as Mapplets and Worklets, combined with user-defined functions, for use across multiple mappings.
  • Executed jobs using ETL framework by setting up scripts and configuration files.
  • Created deployment groups for production migration (mappings, workflows, parameter files and UNIX scripts) and supported post production support during warranty.
  • Practiced soft proofing on JDF-enabled products to enable them to communicate with each other.
  • Used scheduling tools to create new jobs, with job dependencies set up across different Autosys cycles.
  • Experienced in writing SQL queries for retrieving information from the database based on the requirement.
  • Experienced in creating partitions and adding dimensions.
  • Successfully implemented IDD using hierarchy configuration and created subject area groups, subject areas, subject area child, IDD display packages in hub and search queries for searching the data for crops, materials and breeders in IDD data tab.
  • Involved in the UAT testing of the Items Data Mart price files tool automation project.
  • Performed testing on connections, scripts, workflows, mappings and other scheduled activities.
  • Worked with the offshore team and supervised on their development activity.
  • Reviewed code and confirmed that it conformed to standard programming practices.
  • Conducted Knowledge transfer sessions about the project to the team and managed the project by providing reviews on it.
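
A compact SQL sketch of the star-schema loading pattern referenced above: surrogate-key lookups from conformed dimensions into a fact table. The Informatica mappings implemented this logic; the table and column names here (stg_sales, date_dim, product_dim, customer_dim, sales_fact) are assumptions for illustration.

    -- Resolve surrogate keys from the dimensions and load the fact table.
    INSERT INTO sales_fact (date_key, product_key, customer_key, qty, amount)
    SELECT d.date_key,
           p.product_key,
           c.customer_key,
           s.qty,
           s.amount
      FROM stg_sales s
      JOIN date_dim     d ON d.calendar_date = s.sale_date
      JOIN product_dim  p ON p.product_id    = s.product_id  AND p.current_flag = 'Y'
      JOIN customer_dim c ON c.customer_id   = s.customer_id AND c.current_flag = 'Y';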

Confidential, Enfield, CT

Sr. ETL Designer

Responsibilities:

  • Responsible for Requirement gathering for the IDT Feeds from the business users directly.
  • Responsible for IDT related deliverables to ensure all project goals are met within projected timelines.
  • Responsible for Technical Design Document preparation based on both functional and non-functional requirement specifications.
  • Developed IDT feeds using Informatica Mappings and Teradata scripts.
  • Developed and implemented IDQ Mappings for data profiling, applying rules, and moving data from source to target systems.
  • Developed Teradata Views, Procedures and BTEQ scripts to implement the business logic (see the BTEQ sketch after this list).
  • Interacted with technical analysts, business analysts and operations analysts to resolve data issues with IDT.
  • Create Informatica Change Request Document and Database Change Request Document to migrate the changes from DEV to QA and from QA to PROD.
  • Creation of BTEQ, FastExport, MultiLoad, TPump and FastLoad scripts for extracting data from various production systems.
  • Work with DBA's to design and build the Staging/Target tables, grant synonyms to users, create indexes.
  • Performed extensive testing and wrote SQL queries to ensure the smooth loading of the data.
  • Coordinated SIT and UAT directly with QA and the business, and addressed the resulting changes.
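
A minimal BTEQ sketch of the kind of load script referenced above. The logon values, database, and table names (edw_stage.idt_feed_stage, edw_src.account_daily) are placeholders, not the actual IDT objects.

    .LOGON tdpid/etl_user,etl_password;
    DATABASE edw_stage;

    /* Refresh the staging table from the prior day's source extract;
       the .IF below aborts the job with a non-zero code on any SQL error */
    DELETE FROM idt_feed_stage;

    INSERT INTO idt_feed_stage (account_id, feed_date, balance)
    SELECT account_id, feed_date, balance
    FROM   edw_src.account_daily
    WHERE  feed_date = CURRENT_DATE - 1;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;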

Confidential, EAGAN, MN

Sr. ETL Developer

Responsibilities:

  • Responsible for Requirement gathering from the business users directly.
  • Responsible for Technical Design Document preparation based on the Requirement Specifications from the business.
  • Developed Oracle Views, SQL and UNIX scripts, Informatica ETL, PL/SQL, and job scheduling scripts required to manage the loading of data from transactional systems into the data warehouse (see the PL/SQL sketch after this list).
  • Developed and implemented IDQ Mappings for Address and Customer Validations using Address Doctor.
  • Consumed web services for Informatica MDM Batch Scheduling.
  • Performed unit testing at various levels of the phases.
  • Created scheduling jobs using ESP Scheduler.
  • Supported existing Data Models for the F&R area and enhanced the facts and dimensions as per the new business requirements.
  • Create Informatica Change Request Document and Database Change Request Document to migrate the changes from DEV to QA and from QA to PROD.
  • Work with DBAs to design and build the Staging/Target tables, grant synonyms to users, create indexes.
  • Presented the code in the Monthly/Weekly Release meeting and filed all the documents for SOX audit and compliance.
  • Performed extensive testing and wrote SQL queries to ensure the smooth loading of the data.
  • Coordinated SIT and UAT directly with QA and the business, and addressed the resulting changes.
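
A short PL/SQL sketch of a staging-to-warehouse load with basic audit logging, of the kind referenced above. The schema, table and sequence names (stg.orders, dw.order_fact, dw.customer_dim, dw.etl_audit, dw.order_key_seq) are illustrative assumptions.

    -- Load the fact table from staging, then record the row count in an audit table.
    DECLARE
      v_rows PLS_INTEGER;
    BEGIN
      INSERT INTO dw.order_fact (order_key, customer_key, order_date, amount)
      SELECT dw.order_key_seq.NEXTVAL,
             c.customer_key,
             s.order_date,
             s.amount
        FROM stg.orders s
        JOIN dw.customer_dim c
          ON c.customer_id = s.customer_id
         AND c.current_flag = 'Y';

      v_rows := SQL%ROWCOUNT;

      INSERT INTO dw.etl_audit (job_name, rows_loaded, run_date)
      VALUES ('ORDER_FACT_LOAD', v_rows, SYSDATE);

      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;
    END;
    /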

Confidential

ETL Developer

Responsibilities:

  • Developed processes to move data from various source systems to the Staging area and from Staging to the Data Marts.
  • Used Informatica designer for designing mappings and Mapplets to extract data from various source systems
  • Created Tasks, Workflows, Sessions to move the data at specific intervals on demand using Workflow Manager
  • Monitored scheduled, running, completed and failed sessions using the Workflow Monitor.
  • Developed complex SCD Type-1 and Type-2 mappings to load the data from Oracle using various transformations (see the SQL sketch after this list).
  • Designed and developed Oracle PL/SQL scripts for data import/ export using DB tool - SQL Developer
  • Developed the control files to load data into the system via SQL*Loader
  • Performed unit testing and integration testing for the mappings created
  • Performed complex bug fixes in SIT and UAT to ensure the proper delivery of the developed jobs into the production environment
  • Updated existing models to integrate new functionality into an existing application
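
A condensed SQL sketch of the SCD Type-2 pattern referenced above: expire the current dimension row, then insert a new version. The actual loads were implemented as Informatica mappings; the table, column and sequence names here (stg_customer, customer_dim, customer_dim_seq) are hypothetical.

    -- Step 1: close out current dimension rows whose tracked attributes changed.
    UPDATE customer_dim d
       SET d.effective_end_date = TRUNC(SYSDATE) - 1,
           d.current_flag       = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM stg_customer s
                    WHERE s.customer_id = d.customer_id
                      AND (s.address <> d.address OR s.segment <> d.segment));

    -- Step 2: insert a new current version for changed and brand-new customers.
    INSERT INTO customer_dim (customer_key, customer_id, address, segment,
                              effective_start_date, effective_end_date, current_flag)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address, s.segment,
           TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
      FROM stg_customer s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id = s.customer_id
                          AND d.current_flag = 'Y');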

Confidential

Production Support Specialist

Responsibilities:

  • Understand the warehouses and sources, functionally analyze the application domains, participate in knowledge transfers from dependent teams, understand the business activities and application programs, and document the findings for internal team reference.
  • Fixing the invalid mappings and troubleshooting the technical problems of the Database.
  • Work on various data sources like Oracle, SQL Server, Fixed Width and Delimited Flat Files, DB2, COBOL files & XML Files
  • Responsible for Technical Design Document preparation based on the Requirement Specification from the business.
  • Responsible for creating Developer Test Cases and capturing the Unit Test Cases.
  • Responsible for preparing the Weekly and Monthly Service Level Agreement Reports.
