Junior Informatica and Teradata Developer Resume
SUMMARY
- 5 years of IT experience in Software Development, Analysis, ETL, Requirements Gathering, and Data Warehousing for clients in the Healthcare and Retail domains.
- Develop and review Teradata code with Informatica PowerCenter 9.5.1 and 10.0
- Use of the Debugger and pushdown optimization techniques in Informatica
- Understand the data model and create source-to-target mappings for the ETL process
- Create source-to-target mappings and value-added processes (VAP) in ETL design; knowledge of prescribing indexes and of the table types to be created in the Staging and Target areas, such as temporary, SET, MULTISET, volatile, and global temporary tables
- Used Teradata utilities - FastLoad, MultiLoad (MLoad), BTEQ, FastExport, and TPT (see the BTEQ sketch after this summary)
- Ability to write unit test cases and scenarios and to validate unit test results
- Create mappings, sessions, and workflows in Informatica PowerCenter ETL, using different transformations and handling flat files (CSV) and databases such as Teradata as sources and targets
- Gather requirements, write design documents, and perform data analysis along with Business Analysts and Enterprise Architects
- Worked on Type I and Type II data models and performed data cleansing before commencing the ETL process for data from various source systems
- Good understanding of Teradata SET and MULTISET tables
- Good understanding of AWS cloud concepts
- Excellent understanding of SCM tools such as SVN, Git, Bitbucket, and GitHub
- Strong exposure to creating Docker images and Docker containers.
- Experience in various roles as DevOps Engineer, Cloud Engineer, and Build and Release Engineer, with excellent experience in software integration, configuration, packaging, building, automating, managing, and releasing code from one environment to another, deploying to servers, and providing support and maintenance on Unix/Linux/VM platforms.
- Strong exposure to both AWS and Azure cloud platforms
- Hands-on experience in scripting languages such as Groovy, JSON, YAML, and Shell scripting.
- Extensive experience in setting up CI/CD pipelines using tools such as Jenkins, TeamCity, Maven, Nexus, Slack, and VSTS.
- Experience in integrating code quality tools such as SonarQube, JaCoCo, and Veracode in CI/CD pipelines.
- Strong exposure to configuration management tools such as Ansible, Puppet, Terraform, and Docker
- Strong knowledge of practicing TDD and automating JUnit tests using Maven in Jenkins.
- Strong knowledge of Tomcat and WebLogic servers on different operating systems, including Windows, Linux, VMware, UNIX, and Solaris platforms.
- Participated in the release cycle of the product, which involved environments such as Development, QA, UAT, and Production.
- Worked with project documentation and documented application-related issues and bugs on the internal wiki website.
- A highly motivated, energetic individual and a team player with excellent communication and interpersonal skills
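A minimal BTEQ sketch of the kind of staging/target table work listed above (MULTISET and global temporary tables). The logon string, databases, tables, and columns are hypothetical placeholders, not objects from the projects described in this resume.

```sh
#!/bin/sh
# Hypothetical sketch only: create a MULTISET target table and a global temporary
# work table through BTEQ. Logon string and object names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Persistent MULTISET target table (duplicate rows allowed) */
CREATE MULTISET TABLE edw_db.claims_fact
(
    claim_id    INTEGER NOT NULL,
    member_id   INTEGER,
    claim_amt   DECIMAL(12,2),
    load_dt     DATE FORMAT 'YYYY-MM-DD'
)
PRIMARY INDEX (claim_id);

/* Global temporary work table: definition persists, rows are session-scoped */
CREATE GLOBAL TEMPORARY TABLE stg_db.gt_claims_work
(
    claim_id    INTEGER,
    claim_amt   DECIMAL(12,2)
)
PRIMARY INDEX (claim_id)
ON COMMIT PRESERVE ROWS;

.LOGOFF;
.QUIT;
EOF
```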
TECHNICAL SKILLS
Databases and Systems: Teradata, SQL Server
Teradata Utilities: MultiLoad (MLoad), FastLoad, BTEQ, FastExport, and TPT
ETL: Informatica PowerCenter 9.5.1, PowerCenter 10.2
Working Methodologies: Jira
Operating Systems: Windows, Unix
Scripting: Groovy, JSON, YAML, Shell
Other Client Tools: SQL Assistant, Visio, Teradata Studio
Cloud Platforms: AWS, Azure
Build Tools: Maven
SCMs: SVN, Git, GitHub, BitBucket
IaC Tools: Puppet, Ansible, Terraform
Containers: Docker
Application/Web Servers: Tomcat, WebLogic 9.x/10.x/12c, Apache 2.x/1.3x
PROFESSIONAL EXPERIENCE
Confidential
Junior Informatica and Teradata Developer
Responsibilities:
- Create Informatica mappings and workflows, using various transformations to perform the ETL aspects of the data migration
- Performance-tune Informatica jobs in PowerCenter, utilizing pushdown optimization
- Analyze the data matches and the tables across various databases to modify the sources
- Create the mappings using transformations such as Expression, Joiner, Filter, Update Strategy, and Lookup
- Create and track tasks in Jira
- Create tables, views in Teradata
- Define an approach to perform the stats operations on newly created and modified tables (see the sketch after this list).
- Calculate the necessary space in GB for new incoming data and raise a request to the DBA
- Implement performance tuning techniques and write advanced SQL queries to find the technical columns that correspond to the given business terms
- Use SQL Server Management Studio to create tables in SQL Server and populate data from APIs
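A hedged sketch of the stats approach mentioned above: a small shell wrapper that refreshes statistics on a list of newly created or modified tables through BTEQ. The logon string, databases, tables, and stats columns are illustrative assumptions, not the actual project objects.

```sh
#!/bin/sh
# Illustrative only: collect statistics on new/modified Teradata tables via BTEQ.
# Logon string, table names, and stats columns below are placeholders.
STATS_LIST="edw_db.claims_fact:claim_id edw_db.member_dim:member_id"

for ENTRY in $STATS_LIST; do
  TBL=${ENTRY%%:*}   # table name before the colon
  COL=${ENTRY##*:}   # stats column after the colon
  bteq <<EOF
.LOGON tdprod/etl_user,etl_password;
/* Column-level statistics for the optimizer */
COLLECT STATISTICS ON ${TBL} COLUMN (${COL});
/* Table-level row count and block statistics */
COLLECT SUMMARY STATISTICS ON ${TBL};
.LOGOFF;
.QUIT;
EOF
done
```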
Confidential, Plano, TX
Junior Informatica and Teradata Developer
Responsibilities:
- Set up Git repositories and SSH Keys in Bitbucket for Agile teams.
- Helped teams to configure Webhooks in Bitbucket to trigger automated builds in Jenkins.
- Used Terraform and Ansible to migrate legacy and monolithic systems to Amazon Web Services; used Terraform scripts to configure AWS resources.
- Wrote Ansible playbooks from scratch in YAML; installed, set up, and troubleshot Ansible; created and automated platform environment setup on the AWS cloud.
- Set up CI/CD pipelines for microservices and integrated tools such as Maven, Bitbucket, SonarQube, Nexus, Docker, and Slack to provide immediate feedback to DEV teams after code check-in (see the sketch after this list).
- Set up the Jenkins master, added the necessary plugins, and added more slaves to support scalability and agility.
- Created Dockerfiles and automated Docker image creation using Jenkins and Docker.
- Automated infrastructure provisioning on AWS using Terraform and Ansible.
- Created nightly builds with integration to code quality tools such as SonarQube and Veracode.
- Created quality gates in the SonarQube dashboard and enforced them in the pipelines to fail builds when conditions are not met.
- Converted Java projects into Maven projects by creating POM files and ensured all dependencies were built.
- Worked on integrating Git into the Continuous Integration (CI) environment along with Jenkins.
- Managed and mentored both onsite and offshore teams.
- Enforced Test-Driven Development for the DEV teams in every sprint
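A hedged sketch of one pipeline stage of the kind described above, written as a flat shell script for readability: Maven build and unit tests, Docker image build and push, then a Terraform apply. The registry URL, image name, and infra/ directory are assumptions, and the actual setup ran these steps as Jenkins pipeline stages rather than a single script.

```sh
#!/bin/sh
# Illustrative CI/CD stage, not the actual project pipeline.
# Registry URL, image name, and Terraform directory are placeholders.
set -e

# 1. Build and unit-test the service with Maven (JUnit/TDD gate runs here)
mvn clean verify

# 2. Build the Docker image from the project Dockerfile and push it to the registry
IMAGE="registry.example.com/orders-service:${BUILD_NUMBER:-local}"   # BUILD_NUMBER set by Jenkins
docker build -t "$IMAGE" .
docker push "$IMAGE"

# 3. Provision or refresh the AWS infrastructure declared in Terraform
cd infra/
terraform init -input=false
terraform apply -input=false -auto-approve
```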
Confidential
Software Developer
Responsibilities:
- Create DDLs for tables and views in SQL Developer based on the metadata sheet and the Logical data model
- Import the source and target definitions into PowerCenter from the SQL Server tables and CSV files
- Import XML files, as a few of the systems have data coming in XML layouts
- Modify the structure of the tables and their corresponding definitions every time there is an update
- Create Informatica mappings and workflows in PowerCenter, utilizing transformations such as Joiner, Filter, Router, Source Qualifier, and Lookup
- During testing, create temporary tables and volatile tables (see the sketch after this list)
- Ensure the correct primary keys are in place for the tables and that datatypes are handled.
- Write test cases and perform Unit and System testing to validate the data flow and capture any data loss
- Underwent training in programming and database concepts
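A hedged sketch of the kind of unit-test validation described above: a session-scoped volatile table plus a source-versus-target row-count comparison run through BTEQ. The logon string, databases, and table names are illustrative assumptions, not the actual project objects.

```sh
#!/bin/sh
# Illustrative data-validation check, not the actual project test script.
# Logon string and object names are placeholders.
bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;

/* Volatile work table holding target row counts for this test session */
CREATE VOLATILE TABLE vt_tgt_counts AS
(
    SELECT load_dt, COUNT(*) AS tgt_rows
    FROM edw_db.claims_fact
    GROUP BY load_dt
)
WITH DATA
ON COMMIT PRESERVE ROWS;

/* Any row returned flags a source/target mismatch, i.e. potential data loss */
SELECT s.load_dt, s.src_rows, t.tgt_rows
FROM (SELECT load_dt, COUNT(*) AS src_rows
      FROM stg_db.claims_stg
      GROUP BY load_dt) s
LEFT JOIN vt_tgt_counts t
  ON s.load_dt = t.load_dt
WHERE t.tgt_rows IS NULL
   OR s.src_rows <> t.tgt_rows;

.LOGOFF;
.QUIT;
EOF
```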