Manager, Data Engineering Resume
Wilmington, DE
TECHNICAL SKILLS
AWS Services: IAM, S3, EC2, VPC, Redshift, EMR, Kinesis, Machine Learning, CloudTrail, CloudWatch, Lambda, RDS, Route 53, CloudFormation, Elastic Beanstalk, DynamoDB, SQS, SES, SNS, OpsWorks, Docker
Languages: SQL, Java, Python, Scala, C, C++, HTML, Visual Basic, JavaScript, UNIX Shell Scripting, Perl, XML, JSON, Android Studio
ETL Tools: Informatica PowerCenter, Ab Initio GDE
Operating Systems: Linux, UNIX, Windows 10/8/7/Vista/XP/NT/2000/98
Databases: Oracle, Teradata, SQL Server, MS Access, DB2
Big Data Ecosystem: Cloudera Hadoop, HDFS, Hive, Pig, Impala, Hue, Spark, Data Registry, Nebula, DQMR, DOCS Frameworks
Packages: Toad, Teradata SQL Assistant, SQL Developer, SQL*Plus, WinSQL, SQL*Loader
Reporting Tools: Business Objects, Cognos, Tableau
CI/CD & Automation: GitHub, Jenkins, IBM uDeploy, COF TAFFY Testing Framework, Artemis Framework
Modeling Tools: Aqua Data Studio 17, ERwin, Embarcadero, MS Visio
Scheduling Tools: Tidal, Control-M
Agile Tools: VersionOne, JIRA
PROFESSIONAL EXPERIENCE
Confidential, Wilmington, DE
Manager, Data Engineering
Responsibilities:
- Built a production Redshift cluster on AWS for the Retail & Direct Bank
- Defined end-to-end AES-256 encryption standards for Redshift data loading and unloading using Lambda and KMS keys; built Lambda functions in Python to encrypt files in S3 before loading them into Redshift with the COPY command
- Built a Data Analyst Community user sandbox in Redshift, S3, and PostgreSQL on EC2
- Drafted strategies for data movement from the Hadoop Data Lake edge node server to S3 and on to the Redshift cluster using EC2 instances
- Scripted CloudFormation templates for setting up Dev, QA and Prod Redshift cluster instances
- Drafted and executed a disaster recovery (DR) strategy for the production Redshift cluster
- Built Redshift-to-Tableau connectivity for the Retail Bank DA Community
- Defined user access policies and modeling standards for the Retail Bank user community on the Redshift cluster
- Adopted Nebula, the enterprise metadata registry tool, for Retail Bank cloud objects
- Built a CI/CD pipeline using the in-house Artemis DevOps tool, spanning story writing in VersionOne and JIRA through production deployment, including an automated change management process
- Delivered across multiple environments including the Cloudera Hadoop Data Lake, Ab Initio DDE, and legacy Confidential DIRECT ODW systems
- Peer-reviewed code and design patterns from team members and provided best-practice recommendations
- Delivered data solutions within EDS Home Loans and Bank Retail/Direct LOBs; successfully managed deliveries remotely with a distributed team setup
- Experience with DQMR, DOCS, Java, Scala, Python, Cascading, SnapLogic, AWS Redshift, GitHub, CI/CD, Jenkins
- Built large-scale, reliable applications that impact the way Confidential does business; participated in detailed technical design, development, and implementation of applications on existing (Ab Initio/DDE) and emerging (Hadoop/HDFS) platforms
- Accountable for delivery of multiple Agile initiatives across project platforms using Scrum and Kanban methodologies; led the team and helped associates manage their careers
- Worked in an Agile environment, actively participating in story planning and detailed daily stand-ups while providing input on architectural design decisions to meet story acceptance criteria
- Continuously developed technical knowledge and skills across multiple platforms; educated team members (including Scrum Masters and Product Owners) on the capabilities and constraints of technologies
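The encrypted S3-to-Redshift load described above can be sketched in Python. This is a minimal, illustrative sketch only: the table, bucket, IAM role, and key placeholder are hypothetical and not taken from the resume; Redshift's COPY command does support the ENCRYPTED and MASTER_SYMMETRIC_KEY options for client-side AES-256 encrypted files.

```python
# Hypothetical sketch: build a Redshift COPY statement for loading
# client-side AES-256 encrypted files from S3, as one might do inside
# a Python Lambda function. All names below are illustrative.

def build_copy_statement(table: str, s3_path: str, iam_role: str,
                         master_symmetric_key: str) -> str:
    """Return a COPY command for an AES-256 client-side-encrypted load."""
    return (
        f"COPY {table} "
        f"FROM '{s3_path}' "
        f"IAM_ROLE '{iam_role}' "
        f"MASTER_SYMMETRIC_KEY '{master_symmetric_key}' "
        "ENCRYPTED FORMAT AS CSV;"
    )

if __name__ == "__main__":
    stmt = build_copy_statement(
        table="retail.transactions",
        s3_path="s3://example-bucket/encrypted/",
        iam_role="arn:aws:iam::123456789012:role/RedshiftCopyRole",
        master_symmetric_key="<base64-encoded-256-bit-key>",
    )
    print(stmt)
```

In practice the generated statement would be executed against the cluster from the Lambda function after the encrypted files land in S3.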
Scrum Master / Sr. Business Systems Analyst
Confidential
Responsibilities:
- Created a culture of continuous improvement within the team; continuously coached the team and surrounding stakeholders to realize their highest potential; fostered open communication and transparency by building trust within teams; helped other team members facilitate Scrum ceremonies
- Gathered business and technical details in close collaboration with the Product Owner and technical team in the BSA role, and drove grooming sessions
- Communicated committed stories for the upcoming sprint to the Product Owner and stakeholders; supported the Product Owner and Product Manager in maintaining data in the appropriate Agile tool set to provide transparency into the roadmap and related priorities
- Managed sprint team data in VersionOne; represented the team at the Scrum of Scrums (S2) meeting to report progress; proactively helped the team identify and resolve impediments, facilitating resolution in a timely, effective manner
- Facilitated retrospective meetings at the end of each sprint; ensured the highest-priority retrospective points were followed up on and closed out; monitored the team backlog in VersionOne daily and ensured it accurately reflected current state, including time to be burned and status
- Remained active in Agile communities (internal and external) and shared best practices and lessons learned; managed relationships with other component teams necessary for product release in a matrixed, distributed environment
Confidential
Sr ETL Application Developer
Responsibilities:
- Created and managed Workflows in the Workflow Designer and executed Tasks such as Sessions and Commands using the Informatica 8.1 Workflow Manager
- Created the Repository and maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager
- Created source definitions, target definitions, Mappings, Transformations, Reusable Transformations, and Mapplets using the Informatica Designer tool, which includes Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer
- Created Data Breakpoints and Error Breakpoints for debugging mappings using Informatica Debugger Wizard
- Wrote pre- and post-session scripts for mappings
- Worked on Performance tuning for ETL processes
- Used Informatica Repository Manager to backup and migrate metadata in Development, Test and Production systems
- Extracted, scrubbed, and transformed data from flat files, Oracle, and SQL Server, then loaded it into an Oracle database using Informatica
- Created PL/SQL Stored Procedures in Oracle
- Excellent exposure to BI analysis tools such as Cognos PowerPlay, ReportNet, Report Studio, Impromptu, and Macros
- Experience with Oracle Explain Plans for performance tuning
- Familiar with PL/SQL packages in both application transaction-processing and batch-processing (ETL) contexts
- Worked with advanced Oracle features like Parallel query, advanced queuing, database job processing
- Developed UNIX Shell Scripts to run various Jobs
- Developed Pre-Session and Post-Session UNIX scripts to automate the data load processes to target Data warehouse
- Thorough understanding of exception handling and propagation in PL/SQL
- Used PL/SQL Debugging tools like TOAD and SQL Developer
- Actively involved in support for User Acceptance Testing and Production Support
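The pre-/post-session automation mentioned above was done in UNIX shell; the same idea can be sketched in Python (a language also listed in this resume). This is an illustrative sketch under assumed conventions, not the original scripts: a pre-session check that the source file exists and is non-empty, and a post-session step that archives the processed file.

```python
# Illustrative sketch of pre-/post-session housekeeping around a data
# load job. File paths and the archive convention are assumptions made
# for this example, not details from the original work.

import shutil
from pathlib import Path


def pre_session_check(source_file: Path) -> bool:
    """Pre-session: confirm the source file exists and is non-empty
    before the load session is allowed to start."""
    return source_file.is_file() and source_file.stat().st_size > 0


def post_session_archive(source_file: Path, archive_dir: Path) -> Path:
    """Post-session: move the processed file into an archive directory
    so reruns never pick up an already-loaded file."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    target = archive_dir / source_file.name
    shutil.move(str(source_file), str(target))
    return target
```

A scheduler (Tidal or Control-M, as listed above) would call the pre-session check before triggering the session and the archive step after it completes.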
Confidential, South Plainfield, NJ
Programmer Analyst
Responsibilities:
- Interacted as the ETL Lead Developer with the client's technical contact to understand and clarify requirements
- Created and managed Workflows in the Workflow Designer and executed Tasks such as Sessions and Commands using the Informatica 7.2 Workflow Manager
- Experience in Transaction management including autonomous transactions and understanding of rollback segment usage
- Understanding of exception handling and propagation in PL/SQL
- Used PL/SQL Debugging tools
- Created the Repository and maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager
- Created Source definitions, Target definitions, Mappings, Transformations, Reusable Transformations, Mapplets using Informatica Designer tool, which includes Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer
- Wrote pre- and post-session scripts for mappings
- Coordinated training sessions and demonstrations of the Informatica tool and its usage for client associates
Confidential, Stamford, CT
Programmer Analyst
Responsibilities:
- Extracted, scrubbed, and transformed data from flat files, Oracle, and SQL Server, then loaded it into an Oracle database using Informatica
- Created PL/SQL Stored Procedures to sync data from Oracle (SIMON) to DB2 (FACTS) Databases
- Developed UNIX Shell Scripts for Scheduling purposes
- Worked with advanced Oracle features like parallel query, advanced queuing, database job processing
- Worked on performance tuning for ETL processes
- Actively involved in support for User Acceptance Testing and Production Support
- Coordinated training sessions and demonstrations of the application and its usage for end users
- Used Informatica Repository Manager to backup and migrate metadata in development, test and production systems
- Improved performance by identifying and rectifying performance bottlenecks
