Big Data ETL Developer Resume
Tampa, FL
SUMMARY:
- Worked on data migration from Oracle to Postgres using AWS services and PySpark, processing daily volumes ranging from one million to 10 million records (see the PySpark sketch after this list).
- Experience designing data models for a data lake supporting multiple on-prem source systems, and loading that data into a cloud environment for analytical applications.
- Experience creating Hive internal/external tables, pulling structured, semi-structured, and streaming data from different sources and loading it into the data lake.
- Experienced in designing ETL applications that extract data from the data lake and load it into data marts such as S3 and Postgres.
- Designed and developed ETL applications using Python, Spark (PySpark), Hadoop, and Postgres SQL in an AWS environment.
- Strong knowledge of AWS components including EMR clusters and YARN applications, S3, RDS, Lambda, CloudWatch, and EC2 Parameter Store.
- Participated in a POC for Snowflake, which will replace the Postgres data mart.
- Experience creating and customizing database objects such as tables, functions, triggers, stored procedures, SQL scripts, and packages using PL/SQL.
- Experience in gathering reporting and analysis requirements and implementing them.
- Experience in Oracle Application Framework (OAF) and ADF portal development: customizing and extending OAF/ADF applications and creating new applications in JDeveloper 10g and 11g using Java (J2EE/J2SE) and PL/SQL.
- Experience using development environments and tools such as NetBeans, JDeveloper, Eclipse, Tableau Desktop, Toad, SQL*Plus, PuTTY, SVN (Subversion), Microsoft Access and Outlook, GitHub, CA Workstation, and SQuirreL.
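A minimal sketch of the Oracle-to-Postgres migration pattern described above, reading a daily slice over JDBC with PySpark and appending it to the mart. The hosts, credentials, and table names are placeholders rather than details from the original projects, and the JDBC driver jars are assumed to be on the Spark classpath:

```python
from pyspark.sql import SparkSession

# Sketch: pull one day of data from Oracle over JDBC and append it to Postgres.
# All connection details and table names below are illustrative placeholders.
spark = (SparkSession.builder
         .appName("oracle-to-postgres-migration")
         .getOrCreate())

daily_df = (spark.read.format("jdbc")
            .option("url", "jdbc:oracle:thin:@//oracle-host:1521/ORCL")
            .option("dbtable",
                    "(SELECT * FROM src.orders WHERE load_date = TRUNC(SYSDATE)) src_orders")
            .option("user", "etl_user")
            .option("password", "***")
            .option("driver", "oracle.jdbc.OracleDriver")
            .load())

(daily_df.write.format("jdbc")
 .option("url", "jdbc:postgresql://pg-host:5432/mart")
 .option("dbtable", "mart.orders")
 .option("user", "etl_user")
 .option("password", "***")
 .option("driver", "org.postgresql.Driver")
 .mode("append")
 .save())
```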
PROFESSIONAL EXPERIENCE:
Confidential
Big Data ETL Developer
Responsibilities:
- Developed ETL pipelines using Python with Spark DataFrame and RDD techniques; these jobs bring data from the data lake to the Postgres mart (see the DataFrame sketch after this list).
- Implemented RDD/Dataset/DataFrame transformations in Python through SparkContext and HiveContext.
- Wrote Spark SQL and embedded the SQL in Python files, generating jar files for submission to the Hadoop cluster.
- Developed algorithms and scripts in Hadoop to import data from source systems and persist it in HDFS (Hadoop Distributed File System) for staging.
- Familiar with AWS Spark cluster setup and cloud infrastructure setup for ETL applications.
- Wrote Lambda functions to configure ETL jobs on the EMR cluster and to track YARN application logs (see the Lambda sketch after this list).
- Generated CloudWatch logs for the ETL application and monitored them for risk detection.
- Wrote Postgres functions and tables to drive ETL application loads.
- Familiar with Hive external table creation, dataset mapping, and onboarding datasets to the lake.
- Experience using the EC2 Parameter Store to encrypt and decrypt database credentials for security (see the Parameter Store sketch after this list).
- Created runbooks and led sessions for the EAS team to support the ETL application in the cloud.
- Familiar with deploying mid-tier and database applications to different environments using CI/CD pipelines.
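A minimal sketch of the DataFrame and embedded Spark SQL pattern from the bullets above, querying a Hive external table in the lake; the table and column names are hypothetical:

```python
from pyspark.sql import SparkSession, functions as F

# Sketch: run embedded Spark SQL against a Hive external table, then apply
# DataFrame transformations before the mart load. Names are placeholders.
spark = (SparkSession.builder
         .appName("lake-to-mart-etl")
         .enableHiveSupport()   # makes Hive external tables visible to Spark SQL
         .getOrCreate())

raw = spark.sql("SELECT account_id, txn_amount, txn_date FROM lake.transactions")

daily_totals = (raw.filter(F.col("txn_amount") > 0)
                   .groupBy("account_id", "txn_date")
                   .agg(F.sum("txn_amount").alias("daily_total")))
```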
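A sketch of the Lambda pattern for configuring ETL jobs on EMR, assuming a long-running cluster; the cluster id, step name, and script path are placeholders. Anything printed from the handler lands in CloudWatch Logs, which is where job submissions were tracked:

```python
import boto3

# Sketch of a Lambda handler that submits a PySpark step to a running EMR
# cluster. The cluster id and S3 script path are illustrative placeholders.
emr = boto3.client("emr")

def handler(event, context):
    response = emr.add_job_flow_steps(
        JobFlowId="j-XXXXXXXXXXXXX",
        Steps=[{
            "Name": "daily-etl-load",
            "ActionOnFailure": "CONTINUE",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": ["spark-submit", "s3://etl-bucket/jobs/daily_load.py"],
            },
        }],
    )
    # print output from a Lambda handler is captured in CloudWatch Logs.
    print("Submitted EMR step ids:", response["StepIds"])
    return response["StepIds"]
```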
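And a sketch of reading an encrypted database credential from Parameter Store, which is accessed through the boto3 SSM client; the parameter path is a placeholder:

```python
import boto3

# Sketch: fetch a SecureString credential from Parameter Store and decrypt it.
ssm = boto3.client("ssm")

def get_db_password():
    response = ssm.get_parameter(
        Name="/etl/postgres/password",  # placeholder parameter name
        WithDecryption=True,            # KMS-decrypts the SecureString value
    )
    return response["Parameter"]["Value"]
```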
Confidential, Tampa, FL
Database Application Developer
Responsibilities:
- Involved in all phases of the SDLC (Software Development Life Cycle), from analysis, design, development, and testing through implementation and maintenance, in an Agile environment.
- Expertise in client-server application development using Oracle 11g/10g, PL/SQL, SQL*Plus, Toad, and SQL*Loader.
- Created UNIX shell scripts and external tables to load data into database tables for processing.
- Implemented Oracle SQL queries, triggers, stored procedures, and batch components per the project's design and development requirements (see the stored-procedure sketch after this list).
- Experience in debugging and root-cause analysis, providing solutions for better performance.
- Knowledge of implementing SQL queries, schema design, normalization, and performance tuning.
- Experience developing ESP procedures with CA Workload Automation Workstation on mainframes.
- Experience with deployment tools such as Jenkins, plus task management and integration using source code management with GitHub.
- Participated in discussions on the Oracle-to-AWS migration, covering the project's challenges and possibilities.
- Strong knowledge of the functional and technical flow of Confidential's finance application.
- Experienced in system performance testing, unit testing, regression testing, and functional/technical validation of the application.
- Validated and verified issues, communicated with developers to resolve them, and thoroughly documented test cases in JIRA to minimize bugs in the next sprint.
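A minimal sketch, in Python with the cx_Oracle driver, of invoking one of the batch stored procedures described above; the DSN, credentials, and procedure name are placeholders rather than details from the project:

```python
import cx_Oracle

# Sketch: call a PL/SQL batch stored procedure from Python.
# Connection details and the procedure name are illustrative placeholders.
connection = cx_Oracle.connect(
    user="app_user",
    password="***",
    dsn="db-host:1521/ORCLPDB",
)

with connection.cursor() as cursor:
    # callproc runs the PL/SQL procedure with positional IN arguments.
    cursor.callproc("batch_pkg.load_daily_feed", ["2020-01-15"])

connection.commit()
connection.close()
```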
Confidential, Orlando, FL
Application developer
Responsibilities:
- Analyzed day-to-day warehouse data, providing analysis and creating complex dashboards using Tableau Desktop.
- Experienced with the ServiceNow CRM application, creating reports and dashboards; as part of support activity, resolved incidents and performed root-cause analysis.
- Developed Tableau data visualizations using crosstabs, heat maps, box-and-whisker charts, scatter plots, geographic maps, pie charts, bar charts, and density charts.
- Developed Oracle Forms and Reports in an Oracle 12c environment. Participated in various data migration tasks as well as the design, coding, testing, implementation, and documentation of solutions based on business requirements.
Confidential
Oracle Technical Consultant
Responsibilities:
- Developed Java-based Oracle OAF/ADF (Oracle Application Framework / Application Development Framework) applications for different Oracle EBS modules, with a strong understanding of Oracle Workflow in the retail and merchandising industry.
- Involved in the full SDLC (Software Development Life Cycle), from analysis, design, development, and testing through implementation and maintenance, delivering on aggressive deadlines using Agile methods such as Scrum with Jira.
- Developed XML reports using BI/XML Publisher and customized existing reports, forms, and PL/SQL procedures in Oracle Applications.
- Participated in UAT sessions, prepared MD120 documents for business users, and deployed them to the production environment.
- Developed and customized forms, reports, interfaces, and extensions for the application; built complete client-server applications using OAF and ADF technologies.
- Integrated custom applications with standard Oracle application modules and maintained them throughout the life cycle.