
Designer, Developer, Onsite Coordinator Resume


Austin, TX

CAREER SUMMARY:

  • 13+ years of overall experience, with a zest for working in fast-paced environments on complex problems and business challenges in Data Warehousing and ETL across the Cards and Payments, Banking and Financial Services, Retail, and Insurance domains.
  • Good interpersonal skills; committed, result-oriented, and hard-working, with a zeal to learn new technologies and take on challenging tasks.
  • Hadoop/Hive/Spark/Python/Ab Initio professional with 13+ years of experience.
  • Good understanding of HDFS and the complete Hadoop ecosystem.
  • Strong knowledge of the Hadoop ecosystem, including Spark, Hive, and Pig.
  • Experienced in transferring incoming data from various application servers into HDFS and the Hive metastore using Python.
  • Very good experience in processing data using Pig scripts and Hive queries.
  • Strong working exposure to creating and maintaining Hive tables, importing data into them, and extracting the required data based on requirements (see the sketch after this list).
  • Knowledge of AWS services such as EC2, IAM, S3, CloudWatch, Redshift, EMR, VPC, ELB, RDS, and EBS, as well as Snowflake.
  • Extensive experience in various databases like Teradata, DB2 and Oracle.
  • Worked with build and version control tools such as GitHub and Jenkins.
  • Worked in the Ab Initio COE, helping resolve queries raised by users from different vendors working for Allstate.
  • More than 5 years of experience working directly with business users at client locations in the USA and UK to understand and draft requirements.
  • Proven ability to code and test complex, scalable programs and meet stringent deadlines.
  • Strong experience in designing, developing, and reviewing ETL solutions and graphs, and in performance tuning of ETL graphs, SQLs, HQLs, and Unix/Linux scripts.
  • In ETL, worked with various transform functions, parallelism, version control, PDL programming, continuous flows, the CDC GoldenGate component, vector format processing, UMF format processing, ICFF lookups, and database components.
  • Strong experience in Data Analysis, Requirements Gathering, Gap Analysis, Production Support, Incident/Problem Management, Release Management, Configuration Management, and Integration Testing.
  • Strong knowledge of Data Warehousing concepts and dimensional modeling, including Star Schema and Snowflake Schema.
  • Rich experience working as an onsite coordinator; across multiple projects, has led teams of 10-15 members on average, including offshore.
  • An individual with excellent interpersonal and communication skills, strong business acumen, creative problem-solving skills, technical competency, team-player spirit, and leadership skills.
  • Strong working experience in Scrum / Agile framework and Waterfall project methodologies.
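
The Hive bullets above reference creating tables, loading landed files into them, and extracting the required data. A minimal PySpark sketch of that pattern follows; the database, table, column names, and paths (edw.stg_txn, /landing/txn, ...) are hypothetical placeholders, and a configured Hive metastore is assumed.

    # Sketch only: create a Hive table, load a landed CSV feed, extract required rows.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive_table_sketch")
             .enableHiveSupport()          # assumes a Hive metastore is configured
             .getOrCreate())

    spark.sql("CREATE DATABASE IF NOT EXISTS edw")

    # Partitioned staging table (placeholder names).
    spark.sql("""
        CREATE TABLE IF NOT EXISTS edw.stg_txn (
            txn_id  STRING,
            txn_amt DECIMAL(18,2),
            txn_dt  DATE
        )
        PARTITIONED BY (load_dt STRING)
        STORED AS ORC
    """)

    # Import a file landed on HDFS by an upstream feed into the Hive table.
    incoming = spark.read.option("header", True).csv("hdfs:///landing/txn/2024-01-01/")
    incoming.createOrReplaceTempView("incoming_txn")
    spark.sql("""
        INSERT INTO edw.stg_txn PARTITION (load_dt = '2024-01-01')
        SELECT txn_id,
               CAST(txn_amt AS DECIMAL(18,2)),
               CAST(txn_dt  AS DATE)
        FROM incoming_txn
    """)

    # Extract only the rows required downstream.
    required = spark.sql("""
        SELECT txn_id, txn_amt
        FROM edw.stg_txn
        WHERE load_dt = '2024-01-01' AND txn_amt > 0
    """)
    required.show(5)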

TECHNICAL SKILLS:

ETL Tools: Ab Initio (GDE 3.0, Co>Operating System 3.2), DataStage

Databases: Oracle, Teradata, DB2, Open Source

Big Data: Hadoop technologies (HDFS, Hive, HQL, Pig scripts)

Operating Systems: Unix, Microsoft Windows XP/2003/2000

Programming: Python, PySpark, Unix/Linux Shell Scripting, PL/SQL, C++

Schedulers: Tivoli, Ab Initio Plans, ESP, AutoSys

AWS Services: EC2, S3, Redshift, EMR, CloudFormation, IAM, Auto Scaling, EBS, ELB; Snowflake

IDE/Tools: Eclipse

Other Tools: SQL*Plus, Teradata SQL Assistant, Rational ClearQuest, VIPER, Jenkins, GitHub

PROFESSIONAL EXPERIENCE:

Confidential

Designer, Developer, Onsite coordinator, Austin, TX

Technologies & Tools: Ab Initio, HDFS (Hadoop File System), UNIX, DB2, Mainframe, ESP, Viper, ClearCase, Hive, HQLs, Pig scripts.

Responsibilities:

  • Understanding the current GMBS system, its ETL processing and data flow, and performing impact analysis for new requirements and changes in every business release.
  • Designing and developing new Ab Initio graphs, Unix scripts, and extract and load HQLs as per the business requirements.
  • Creating Pig scripts to handle data processing and provide downstream feeds for reporting for certain SIFs.
  • Performing data validation to identify and rectify any data quality issues.
  • Creating HQLs and Pig scripts to perform transformations and data summarization.
  • Creating reports using Python and PySpark to feed data from Hive tables as per business requirements (see the sketch after this list).
  • Creating change requests in AskNow (change management portal) for code migration and installation in production, and developing new ESP jobs to perform production execution depending on the project's run frequency.
  • Planning, effort estimation, work allocation to team members, and tracking and status reporting for timely completion of deliverables for production releases.
  • Helping Team members on any technical & application related challenges.
  • Interacting with client and business users to discuss and resolve issues and challenges in design and development.
  • Performing Code reviews and performance tuning
  • Production implementation and warranty support.
  • Investigation and causal analysis of issues in SIT and UAT
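
As referenced in the reporting bullet above, the following is a minimal PySpark sketch of feeding a report from Hive tables. The table name gmbs.settlement_dtl, its columns, and the output path are hypothetical placeholders, not the actual GMBS objects.

    # Sketch only: read a Hive table, aggregate, and write a report extract.
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("daily_settlement_report")
             .enableHiveSupport()
             .getOrCreate())

    # Pull one load date from a (placeholder) detail table.
    detail = spark.table("gmbs.settlement_dtl").where(F.col("load_dt") == "2024-01-01")

    # Summarize for the business report.
    report = (detail.groupBy("region", "currency")
                    .agg(F.count(F.lit(1)).alias("txn_count"),
                         F.sum("settle_amt").alias("total_settle_amt")))

    # Single-file CSV extract handed to the downstream reporting consumers.
    (report.coalesce(1)
           .write.mode("overwrite")
           .option("header", True)
           .csv("hdfs:///reports/settlement/2024-01-01"))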

Confidential

Designer, Developer, Onsite coordinator, Austin, TX

Technologies & Tools: HDFS (Hadoop File System), UNIX, Hive, HQLs, DB2, Mainframe, ESP, Viper, Git/Bitbucket, and Jenkins

Responsibilities:

  • Understanding requirements from the business and translating them into a Conceptual Approach Document.
  • Creating the design document and reviewing it with the business.
  • Developing data extraction, PAN encryption, and data enrichment to populate the agreed fields in the final Hive table using HQLs.
  • Building a process to Sqoop data from Hive tables (HDFS) into a DB2 table for API reporting (see the sketch after this list).
  • Building a process to keep consistent data available 24x7 while updates to the DB2 tables happen through the daily load jobs.
  • Performing data validation to identify and rectify any data quality issues.
  • Developing data validation queries and debugging issues in production.
  • Creating change requests in AskNow (change management portal) for code migration and installation in production, and developing new ESP jobs to perform production execution depending on the project's run frequency.
  • Conducting knowledge transfer sessions for the production support team to hand over support activities after the production install.
  • Planning, effort estimation, work allocation to team members, and tracking and status reporting for timely completion of deliverables for production releases.
  • Helping Team members on any technical & application related challenges.
  • Interacting with client and business users to discuss and resolve issues and challenges in design and development.
  • Performing Code reviews and performance tuning
  • Production implementation and warranty support.
  • Investigation and causal analysis of issues in SIT and UAT
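
The export described above is built as a Sqoop-style movement from Hive to DB2; the sketch below shows a comparable approach using PySpark's JDBC writer, kept in Python for consistency with the other sketches. The JDBC URL, credentials, and table names are hypothetical placeholders.

    # Sketch only: copy a curated Hive result set into a DB2 staging table over JDBC.
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("hive_to_db2_export")
             .enableHiveSupport()
             .getOrCreate())

    # Read the curated Hive data that backs the API reporting layer (placeholder names).
    curated = spark.sql("""
        SELECT account_id, masked_pan, balance_amt, as_of_dt
        FROM rpt.api_reporting_final
        WHERE as_of_dt = '2024-01-01'
    """)

    # Load a DB2 staging table; the daily job would then swap or merge the staging
    # table so API readers always see a consistent snapshot (the 24x7 requirement).
    (curated.write.format("jdbc")
            .option("url", "jdbc:db2://db2host:50000/RPTDB")   # placeholder URL
            .option("dbtable", "RPT.API_REPORT_STG")           # placeholder table
            .option("user", "svc_rpt")                          # placeholder credentials
            .option("password", "********")
            .option("driver", "com.ibm.db2.jcc.DB2Driver")
            .mode("append")
            .save())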

Confidential

Consultant, Onsite coordinator, Austin, TX

Technologies & Tools: Ab Initio, UNIX, Hive, HDFS, DB2, Mainframe, ESP, Viper, ClearCase

Responsibilities:

  • Understanding the current VisaNet system, its ETL processing and data flow, and leveraging that for designing NSPK.
  • Designing and developing new Ab Initio graphs, Unix scripts, and extract and load HQLs as per the NSPK business requirements.
  • Performing data validation to identify and rectify any data quality issues (see the sketch after this list).
  • Creating, debugging, and modifying various shell scripts in the UNIX/Linux environment.
  • Creating change requests in the change management system and creating packages for code migration and installation.
  • Conducting knowledge transfer sessions for the production support team to hand over support activities after the production install.
  • Planning, effort estimation, work allocation to team members, and tracking and status reporting for timely completion of deliverables for production releases.
  • Helping Team members on any technical & application related challenges.
  • Interacting with client and business users to discuss and resolve issues and challenges in design and development.
  • Performing Code reviews and performance tuning
  • Production implementation and warranty support.
  • Investigation and causal analysis of issues in SIT and UAT
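
The following is a minimal PySpark sketch of the data-validation step called out above: reconciling the record count and a control total between the source extract and the loaded target. The landing path, table name (nspk.txn_detail), and columns are hypothetical placeholders.

    # Sketch only: reconcile counts and a control total between source and target.
    from pyspark.sql import SparkSession, functions as F

    spark = (SparkSession.builder
             .appName("dq_reconciliation")
             .enableHiveSupport()
             .getOrCreate())

    source = spark.read.option("header", True).csv("hdfs:///landing/nspk/2024-01-01/")
    target = spark.table("nspk.txn_detail").where(F.col("load_dt") == "2024-01-01")

    src_stats = source.agg(F.count(F.lit(1)).alias("cnt"),
                           F.sum(F.col("txn_amt").cast("decimal(18,2)")).alias("amt")).first()
    tgt_stats = target.agg(F.count(F.lit(1)).alias("cnt"),
                           F.sum("txn_amt").alias("amt")).first()

    # Flag a data-quality issue if either the row count or the control total drifts.
    if src_stats["cnt"] != tgt_stats["cnt"] or src_stats["amt"] != tgt_stats["amt"]:
        raise ValueError(f"DQ check failed: source={src_stats}, target={tgt_stats}")
    print("DQ check passed:", tgt_stats["cnt"], "rows reconciled")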

Confidential

Project leader, Onsite coordinator, Northbrook

Technologies & Tools: Ab Initio, Unix, Oracle

Responsibilities:

  • Understanding the requirement specifications and mapping rules for the IDM Quotes, House and Home Policy, and cross-sell modules, and creating a low-level design
  • Providing effort estimates, resource utilization, and delivery milestones
  • Leveraging knowledge of Ab Initio and Unix, along with strong application development experience at offshore, to develop and support the IDM HnH Policy and Quotes Acquisition Funnel modules
  • Performing Code reviews and performance tuning
  • Production implementation and warranty support.
  • Investigation and causal analysis of issues in SIT and UAT

Confidential

Project leader, Offshore

Technologies & Tools: Ab Initio, Unix, Oracle

Responsibilities:

  • As an Ab Initio developer in the team, responsible for developing graphs in Ab Initio as per the detailed design
  • Working on the UNIX scripts of the projects
  • Reviewing Ab Initio graphs prior to migration to the test environments
  • Creation and enhancement of Ab Initio graphs and wrapper scripts, and performance tuning of graphs and SQL (see the sketch after this list).
  • Creation of jobs and streams for the project in TWS (Tivoli Workload Scheduler)
  • Analysis, design, and development of code in Ab Initio and UNIX
  • Conducted several design reviews and UTR reviews with the client team and the onsite team.
  • Interacting with Allstate BAs, tech leads, and onshore counterparts to resolve functional queries
  • Leading the offshore development team and providing the necessary support on any technical or functional issue
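
The wrapper scripts above are Unix shell scripts in the project; the sketch below expresses the same idea in Python for consistency with the other sketches: run a deployed graph script, capture its log, and return a non-zero exit code so the scheduler treats the job as failed. The script path and log directory are hypothetical placeholders.

    # Sketch only: wrapper that runs a deployed graph script and surfaces its exit code.
    import subprocess, sys, datetime, pathlib

    GRAPH_SCRIPT = "/apps/proj/run/load_policy_dim.ksh"   # deployed graph script (placeholder)
    LOG_DIR = pathlib.Path("/apps/proj/logs")              # placeholder log directory

    def run_graph(script: str, run_date: str) -> int:
        """Run the graph script with the run date, capturing stdout/stderr to a log file."""
        log_file = LOG_DIR / f"{pathlib.Path(script).stem}_{run_date}.log"
        with open(log_file, "w") as log:
            proc = subprocess.run([script, run_date], stdout=log, stderr=subprocess.STDOUT)
        return proc.returncode

    if __name__ == "__main__":
        run_date = datetime.date.today().strftime("%Y%m%d")
        rc = run_graph(GRAPH_SCRIPT, run_date)
        if rc != 0:
            print(f"Graph failed with return code {rc}; check logs", file=sys.stderr)
        sys.exit(rc)   # a non-zero exit lets the scheduler mark the job as failed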

Confidential

Module leader and Onsite Coordinator, Northbrook

Technologies & Tools: Ab Initio, Unix, Oracle

Responsibilities:

  • Involved in the design of dimensions and facts for the IDM Claims module (see the sketch after this list)
  • Involved in data analysis and requirement analysis using D710
  • Coordinating with offshore team and Allstate BAs to get the queries and concerns resolved proactively
  • Involved in preparation of Technical design document (T235)
  • Solving production issues and providing production support
  • Investigation and causal analysis of various Production problems
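
Below is a minimal sketch of the star-schema idea behind the dimension and fact design work above, written as Hive DDL issued from PySpark to stay consistent with the other sketches (the project itself used Oracle). All table and column names are hypothetical and do not reproduce the actual IDM Claims model.

    # Sketch only: one fact table keyed to dimensions, with additive measures.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("claims_star_sketch").enableHiveSupport().getOrCreate()
    spark.sql("CREATE DATABASE IF NOT EXISTS idm")

    # Conformed date dimension shared across facts.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS idm.dim_date (
            date_key     INT,
            calendar_dt  DATE,
            month_nbr    INT,
            year_nbr     INT
        ) STORED AS ORC
    """)

    # Claim fact referencing its dimensions; measures stay additive.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS idm.fact_claim (
            claim_key        BIGINT,
            policy_key       BIGINT,        -- FK to a policy dimension
            loss_date_key    INT,           -- FK to idm.dim_date
            paid_amt         DECIMAL(18,2),
            outstanding_amt  DECIMAL(18,2)
        ) STORED AS ORC
    """)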

Confidential

Module leader, Offshore

Technologies & Tools: Ab Initio (GDE 1.15 and Co>Operating System 2.15), UNIX, Teradata

Responsibilities:

  • Understanding the complete workflow of the project
  • Involved in design, development, testing and implementation of ISO premium, paid and outstanding losses
  • Solving production issues and providing production support
  • Investigation and causal analysis of various production problems; assigning and delegating tasks to team members and tracking them to closure
