Azure Data Engineer (ETL/ELT) & Informatica Resume
SUMMARY
- Microsoft Azure Data Engineer Associate, currently working in business intelligence as an ETL developer and tester. Also a Certified Professional Scrum Master, with extensive hands-on experience in development tools such as Informatica PowerCenter and in Agile methodologies like Scrum to build high-performing, self-managing teams.
- Experience implementing Azure data solutions: provisioning storage accounts, Azure Data Factory, SQL Server, Azure SQL Database, Azure SQL Data Warehouse, Azure Databricks, and Azure Cosmos DB
- Implement data movement from on-premises systems to the cloud in Azure.
- Develop batch processing solutions using Azure Data Factory and Azure Databricks (a minimal notebook sketch follows this list)
- Implement Azure Databricks clusters, notebooks, jobs, and autoscaling.
- Design for data auditing and data masking
- Design for data encryption for data at rest and in transit
- Design relational and non-relational data stores on Azure
- Preparing ETL test strategies, designs, and test plans for ETL and BI systems.
- Creating ETL test scenarios and test cases, and planning their execution.
- Interacting with business users and understanding their requirements.
- Good understanding of data warehouse concepts.
- Good exposure to and understanding of the Hadoop ecosystem
- Proficient in SQL and other relational databases.
- Good exposure to Microsoft Power BI.
- Good understanding and working knowledge of the Python language
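
A minimal sketch of the kind of Databricks batch job described above, assuming a notebook attached to a cluster with access to an ADLS Gen2 account; the storage account, container, path, table, and column names are placeholders rather than ones from a real project:

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.getOrCreate()

# Placeholder ABFSS path to raw files landed by Azure Data Factory.
raw_path = "abfss://raw@<storage-account>.dfs.core.windows.net/sales/"

# Batch read of the raw CSV files.
raw_df = spark.read.option("header", "true").csv(raw_path)

# Simple cleansing: drop duplicates and rows missing the key column.
clean_df = raw_df.dropDuplicates().filter(F.col("order_id").isNotNull())

# Persist the curated result as a Delta table for the analytics team.
clean_df.write.format("delta").mode("overwrite").saveAsTable("curated.sales")
```

Scheduling a notebook like this as a Databricks job on an autoscaling cluster covers the batch-processing and autoscaling bullets above.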
PROFESSIONAL EXPERIENCE
Confidential
Azure Data Engineer (ETL/ELT) & Informatica
Responsibilities:
- Understand requirements, build code, and guide other developers through development activities to deliver stable, high-quality code within Confidential and client processes, standards, and guidelines.
- Develop Informatica mappings based on client requirements and the needs of the analytics team.
- Perform end to end system integration testing
- Participate in functional and regression testing
- Review and write SQL scripts to verify data from source systems to targets
- Use HP Quality Center to store and maintain test repositories.
- Build transformations to shape the data required by the analytics team for visualization and business decisions.
- Review project plans and provide feedback on gaps, timelines, and execution feasibility as required by the project
- Participate in knowledge transfer (KT) sessions conducted by the customer and other business teams and provide feedback on requirements
- Migrate the client's data warehouse architecture from on-premises into the Azure cloud.
- Create pipelines in ADF using linked services to extract, transform, and load data from multiple sources such as Azure SQL, Blob storage, and Azure SQL Data Warehouse (a minimal SDK sketch follows this list).
- Create storage accounts as part of the end-to-end environment for running jobs.
- Implement Azure Data Factory operations and deployments for moving data from on-premises into the cloud.
- Design data auditing and data masking for security purposes.
- Monitor end-to-end integration using Azure Monitor.
- Apply Azure Active Directory (AAD) access to specific user roles.
- Deploy ADLS accounts and SQL databases.
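
A minimal sketch of publishing such a copy pipeline with the azure-mgmt-datafactory Python SDK, assuming the linked services and the two referenced datasets already exist in the factory; the subscription, resource group, factory, dataset, and pipeline names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSource, CopyActivity, DatasetReference, PipelineResource, SqlDWSink,
)

# Authenticate with whatever identity the environment provides (CLI, managed identity, ...).
credential = DefaultAzureCredential()
adf_client = DataFactoryManagementClient(credential, "<subscription-id>")

# Copy activity from Blob storage into Azure SQL Data Warehouse; the two
# dataset names are assumed to be defined on top of their linked services.
copy_blob_to_dw = CopyActivity(
    name="CopyBlobToSqlDw",
    inputs=[DatasetReference(reference_name="BlobInputDataset")],
    outputs=[DatasetReference(reference_name="SqlDwOutputDataset")],
    source=BlobSource(),
    sink=SqlDWSink(),
)

# Publish (create or update) the pipeline in the target data factory.
adf_client.pipelines.create_or_update(
    "<resource-group>",
    "<data-factory-name>",
    "BlobToSqlDwPipeline",
    PipelineResource(activities=[copy_blob_to_dw]),
)
```

The same pattern extends to other sources and sinks (for example Azure SQL Database) by swapping the source/sink models and the referenced datasets.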
Confidential, Irving TX
Big Data Engineer
Responsibilities:
- Ingested data from multiple sources into HDFS.
- Worked with MySQL and Oracle databases to query data as per requirements.
- Developed the data warehouse and analyses using various business development tools.
- Performed ETL jobs with Hadoop technologies and tools such as Hive, Sqoop, and Oozie to extract records from different databases into HDFS.
- Preprocessed data in Hadoop using various components and tools.
- Imported and exported data between different sources and HDFS for further processing using Apache Sqoop (a minimal import sketch follows this list).
- Created workflows to automate the Sqoop jobs and data ingestion into HDFS.
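
A minimal sketch of one such Sqoop import, wrapped in Python for consistency with the other examples; the JDBC URL, credentials, table, and HDFS directory are placeholders:

```python
import subprocess

# Standard Sqoop import of one MySQL table into HDFS (placeholder names throughout).
sqoop_import = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://<db-host>/<database>",
    "--username", "<etl_user>",
    "--password-file", "/user/etl/.db_password",  # keeps the password off the command line
    "--table", "orders",
    "--target-dir", "/data/raw/orders",
    "--num-mappers", "4",                         # parallel map tasks for the import
]

# Run the import and raise if Sqoop exits with a non-zero status.
subprocess.run(sqoop_import, check=True)
```

In practice a job like this would typically be wrapped in an Oozie workflow for scheduling, in line with the automation bullet above.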
Confidential
ETL Developer/ ETL Tester
Responsibilities:
- Write SQL queries to retrieve information from the databases as required and to support the data cleanup process.
- Validate data transformations and perform end-to-end data validation for ETL and BI systems (a minimal validation sketch follows this list).
- Develop test strategies, test plans, and designs, and execute test cases for ETL and BI systems.
- Develop and execute ETL-related functionality, performance, and integration test cases and documentation.
- Analyze and understand the ETL workflows that have been developed.
- Coordinate with the offshore team on a daily basis to execute test cases as per requirements.
- Verify data completeness, data transformation, and data quality for various data feeds coming from source systems.
- Perform data modeling and data analysis.
- Support all phases of the project development lifecycle using SDLC and other project methodologies.
- Use various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure, and Router to implement the business logic.
- Create complex mappings using connected and unconnected Lookups, Aggregator, Update Strategy, and Router transformations to populate target tables efficiently.
- Participate in team meetings with Data Modelers, Data Architects, and Business Analysts to analyze and resolve modeling issues and to discuss requirements.
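
A minimal sketch of the kind of source-to-target validation described above, assuming pyodbc connections to both systems; the DSNs, table names, and column names are hypothetical:

```python
import pyodbc

# Placeholder ODBC connection strings for the source system and the target warehouse.
SOURCE_DSN = "DSN=source_db;UID=<user>;PWD=<password>"
TARGET_DSN = "DSN=target_dw;UID=<user>;PWD=<password>"

def scalar(conn_str: str, query: str):
    """Run a query that returns a single value and return that value."""
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(query).fetchone()[0]

# Completeness check: row counts must match between source and target.
src_rows = scalar(SOURCE_DSN, "SELECT COUNT(*) FROM sales_orders")
tgt_rows = scalar(TARGET_DSN, "SELECT COUNT(*) FROM dw.fact_sales")
assert src_rows == tgt_rows, f"Row count mismatch: {src_rows} vs {tgt_rows}"

# Transformation check: an aggregated measure should reconcile as well.
src_amt = scalar(SOURCE_DSN, "SELECT SUM(order_amount) FROM sales_orders")
tgt_amt = scalar(TARGET_DSN, "SELECT SUM(order_amount) FROM dw.fact_sales")
assert src_amt == tgt_amt, f"Amount mismatch: {src_amt} vs {tgt_amt}"
```

Checks like these cover the completeness and transformation validations listed above; quality rules (nulls, ranges, referential integrity) follow the same query-and-assert pattern.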
Confidential, Bristol, PA
Informatica Developer and Scrum Master
Responsibilities:
- Attend team meetings with Data Modelers, Data Architects, and Business Analysts to analyze and resolve modeling issues and to discuss requirements.
- Create data mappings and models for data migration.
- Create end-to-end solution for ETL transformation jobs that involve writing Informatica workflows and mappings.
- Use various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, and Router to implement complex logic while coding a mapping.
- Develop Informatica SCD Type-2 mappings based on the requirements (the equivalent logic is sketched after this list).
- Test all mappings and sessions in the development environment
- Remove any impediment that will slow down the team’s progress.
- Build relationships with the Product Owner and other stakeholders to facilitate the team's interaction with them
- Coach team members on Agile principles, providing general guidance on the methodology so they become a self-governing, high-performance team.
- Work with Product Owners to maintain and prioritize the product backlog, write and break down user stories, estimate user stories, and refine the team's work capacity.
- Organize and facilitate project and sprint planning, daily stand-up meetings, reviews, retrospectives, release planning, demos, and other Scrum-related meetings
- Track and effectively communicate team velocity and sprint/release progress to all affected teams and management
- Assist product owner in providing transparency to the work and project status.
- Track bugs and issues using JIRA and manage the progress of the project with it.
- Collaborate with Scrum teams to effectively manage work and resolve issues and conflicts.
- Create and maintain work visibility using Kanban boards, charts, and other Agile tools like JIRA.
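
For the SCD Type-2 mappings referenced earlier in this role, the Lookup + Update Strategy + Router combination produces roughly the following effect, sketched here as SQL issued from Python against hypothetical staging and dimension tables (stg_customer, and dim_customer with effective/expiry dates and a current-row flag):

```python
import pyodbc

TARGET_DSN = "DSN=target_dw;UID=<user>;PWD=<password>"  # placeholder connection

# Step 1: expire the current dimension row when a tracked attribute changed.
EXPIRE_CHANGED_ROWS = """
UPDATE dim_customer
SET expiry_date = CURRENT_TIMESTAMP, is_current = 0
WHERE is_current = 1
  AND customer_id IN (
      SELECT s.customer_id
      FROM stg_customer s
      JOIN dim_customer d
        ON d.customer_id = s.customer_id AND d.is_current = 1
      WHERE s.address <> d.address          -- tracked attribute changed
  );
"""

# Step 2: insert a new current version for new or changed customers.
INSERT_NEW_VERSIONS = """
INSERT INTO dim_customer (customer_id, address, effective_date, expiry_date, is_current)
SELECT s.customer_id, s.address, CURRENT_TIMESTAMP, NULL, 1
FROM stg_customer s
LEFT JOIN dim_customer d
  ON d.customer_id = s.customer_id AND d.is_current = 1
WHERE d.customer_id IS NULL;                -- no current row: new customer or just expired
"""

with pyodbc.connect(TARGET_DSN) as conn:
    cur = conn.cursor()
    cur.execute(EXPIRE_CHANGED_ROWS)   # history rows keep their old values
    cur.execute(INSERT_NEW_VERSIONS)   # new rows become the current version
    conn.commit()
```

This is only an illustration of the Type-2 behavior (preserve history, version the current row); the actual implementation in the project was the Informatica mapping itself.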