
Senior Developer Resume


SUMMARY:

  • A committed, performance-driven engineering professional with over 8 years of experience in bringing value to organizations through a blend of technical leadership, business knowledge, and team focus.
  • Proven ability to assess situations and implement solutions that improve efficiency, reduce rework, and meet strategic goals.
  • Consistently recognized for meeting critical business objectives by leading and motivating diverse, cross-functional teams across several geographical locations, industries, and cultures.
  • Over 8 years of full software development life cycle experience including design, implementation and support roles.
  • Around 5 years of Big Data/Hadoop experience including MapReduce, Pig, Sqoop, and Hive. MapR Certified Hadoop Developer (MCHD). Involved in various stages of data flow processes, including design and support operations.
  • Proficient in the design and development of DataStage jobs, including parallel and sequencer jobs.
  • Experience with Teradata and its load utilities, including FastExport, FastLoad, MultiLoad, and TPT.
  • In-depth experience in designing and supporting batch job scheduling processes.
  • Working experience in Python.

TECHNICAL SKILLS:

ETL tools: IBM InfoSphere DataStage 11.5/8.5/7.5

Databases: Teradata, DB2, SQL Server, Oracle

Big Data/Hadoop: Hive, Pig, Sqoop, Oozie, HDFS, HBase

Search engine: Elasticsearch

Operating System: Linux, AIX, Windows 2000/XP/7

Tools/Utilities: Visual Studio Team Foundation Server, GitHub, VersionOne, Jenkins, HipChat, Visio, Teradata SQL Assistant, SCM

Languages/Scripting: UNIX Scripting, Pig, HiveQL, SQL, Python, Java

Incident/Change Management: ServiceNow, HPSD

Deployment Tools: TAD, SSD

Reporting Tools: MicroStrategy, WebFOCUS, Tableau

Scheduling Tool: Control-M, Jobtrac, Autosys

Methodologies: DevOps, Agile and Waterfall

Other tools: Word, Excel, PowerPoint, Visio, SharePoint

PROFESSIONAL EXPERIENCE:

Confidential

Senior developer

Responsibilities:

  • Take part in analysis and high-level discussions.
  • Work closely with the solution architect on technology decisions and environment setup.
  • Prepare the low-level design document for the future design per the requirements.
  • Construct ETL/ELT batch jobs and the Tableau reporting solution per the design.
  • Prepare test plans and peer-review team members' code and test results.
  • Support the transition of newly developed jobs and reports.
  • Implement jobs in stage and verify that they run correctly.
  • Monitor the jobs and deliver the deliverables to clients.
  • Review and coordinate offshore tasks.

Environment: DataStage 11.5, Teradata, Oracle, Shell scripting, Java, SCM, Jobtrac, Autosys, Tableau, Windows and AIX

Confidential

Senior developer

Responsibilities:

  • Analyze the existing DataStage jobs, group the related jobs, and prepare analysis documents.
  • Coordinate with portfolio teams to check whether any group of jobs does not require remediation.
  • Prepare the design document for the future design.
  • Construct the Hadoop and Java Camel/Spring Boot framework jobs according to the new design.
  • Prepare test plans and peer-review team members' code and test results.
  • Implement jobs in stage and verify that they run correctly.
  • Support the transition of newly developed jobs and reports.
  • Monitor the jobs and deliver the deliverables to clients.
  • Review and coordinate offshore tasks.

Environment: DataStage, Oracle, DB2, Java Spring Boot and Camel frameworks, Eclipse, Git/GitHub, Rundeck, Control-M, ServiceNow, AQT, Hive, Pig, Oozie, Sqoop.

Confidential

Senior developer

Responsibilities:

  • Responsible for building scalable distributed data solutions using Hadoop
  • Installed and configured Hive, Pig, Sqoop and Oozie on the Hadoop cluster
  • Setup and benchmarked Hadoop clusters for internal use
  • Developed data load jobs using Hive and Pig
  • Optimized Map/Reduce Jobs to use HDFS efficiently by using various compression mechanisms
  • Handled importing of data from various data sources, performed transformations using Hive, Pig, and MapReduce, loaded data into HDFS, and extracted data from various RDBMSs into HDFS using Sqoop
  • Analyzed the data by performing Hive queries and running Pig scripts to study customer behavior
  • Continuously monitored and managed the Hadoop cluster using Cloudera Manager
  • Worked with application teams to install operating system, Hadoop updates, patches, version upgrades as required
  • Installed Oozie workflow engine to run multiple Hive and Pig jobs
  • Exported the analyzed data to the relational databases using Sqoop for visualization and to generate reports for the BI team
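An aggregation of the customer-behavior kind described above can be sketched as follows. This is an illustrative example, not the actual project code: Python's built-in sqlite3 stands in for the Hive warehouse, and the table and column names (`page_views`, `customer_id`, `dwell_secs`) are hypothetical.

```python
import sqlite3

# Hypothetical table standing in for a Hive table; schema and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE page_views (customer_id TEXT, page TEXT, dwell_secs REAL)")
conn.executemany(
    "INSERT INTO page_views VALUES (?, ?, ?)",
    [("c1", "home", 12.0), ("c1", "cart", 30.0), ("c2", "home", 5.0)],
)

# The same aggregation would read almost verbatim in HiveQL.
query = """
    SELECT customer_id, COUNT(*) AS views, SUM(dwell_secs) AS total_dwell
    FROM page_views
    GROUP BY customer_id
    ORDER BY total_dwell DESC
"""
for customer_id, views, total_dwell in conn.execute(query):
    print(customer_id, views, total_dwell)
```

In Hive the `page_views` table would typically be an external table over HDFS files landed by Sqoop, and the result would be exported back to an RDBMS for BI reporting.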

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, HBase, MongoDB, SQL, Cloudera Manager, Sqoop, Oozie, Elasticsearch, Java (JDK 1.8), Eclipse, DataStream, Control-M, ServiceNow.

Confidential

Senior developer

Responsibilities:

  • Designed the HLD document by understanding the business requirements and reporting needs
  • Performed data profiling of the source system
  • Pulled and processed the transformed data into the data warehouse system as input to ETL jobs
  • Executed the entire project with Agile methodology
  • Developed the DataStage jobs per the STM document using DataStage stages like Sequential File, Dataset, and Transformer
  • Modified the logical and physical model development with the latest changes to incorporate report subscription
  • Created unit and integrated test cases, participated in unit and system testing, and coordinated with the testing team to write and execute system tests
  • Generated vendor-specific aggregated data per business needs
  • Ensured timely communication with vendor-side SPOCs and stakeholders on project activities, status, and production issues
  • Held primary responsibility for quality reviews of code/design specs, acting as a gatekeeper before the code/design reaches the client
  • Ensured that all monthly and other releases were delivered with quality and on time
  • Responsible for all data-issue recoveries in production
  • Supported the developed jobs in SIT, pre-prod, and PROD environments until sign-off

Environment: Hadoop, IBM DataStage v8.5, Teradata SQL Assistant, MSTFS, PuTTY, Control-M scheduler, Teradata, UNIX Shell Scripting, Erwin

Confidential

Production support executive

Responsibilities:

  • Monitor production batch jobs with 100% accuracy and integrity to support current and future business growth.
  • Ensure availability of Business Intelligence reports for end users to make key business decisions.
  • Coordinate and manage changes and maintenance of applications for Target Finance & BI Systems.
  • Validate and confirm functional and non-functional requirements with the client (e.g., security, capacity).
  • Provide enhancements in Business Intelligence & Enabling applications such as Info Retriever and the Enterprise Data Warehouse (EDW) application in the Business Intelligence portfolio.
  • Study business processes and identify areas for improvement in both technology and business for future growth.
  • Perform event response, incident, and problem management (break/fix) by partnering with development teams to ensure systems are built for stable supportability.
  • Formulate queries to validate the quality of the data loaded for end-user reporting.
  • Review data models and mapping documents; perform code walkthroughs to assure standards.
  • Performed incident management using the HPSD (HP Service Desk)/ServiceNow tool as part of the managed process.
  • Responsible for performance-tuning complex and critical batch flows to meet the LOS required by business users.
  • Implemented new production batch jobs for foundational BI projects based on business requirements from clients.
  • Collaborate with the solution engineering and data modeling teams on design to ensure supportability and stable production systems.
  • Used OLAP tools like MicroStrategy and WebFOCUS and performed enhancements to meet client requirements.
  • Maintained various versions of code using MSTFS, enabling consistency across environments like DEV, STAGE, and PRODUCTION.
  • Suggested proactive ideas in the Enabling and Business Intelligence application areas to automate manual processes and save the time spent on manual work; identified a design bug in an existing application design and provided the fix.
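Data-quality validation queries of the kind mentioned above usually reduce to a handful of reconciliation checks between the staging and warehouse layers. The sketch below is illustrative only, with sqlite3 standing in for Teradata and hypothetical table names (`stage_sales`, `dw_sales`).

```python
import sqlite3

# Illustrative stand-in tables; in practice these would be Teradata stage/target tables.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stage_sales (sale_id INTEGER, amount REAL)")
conn.execute("CREATE TABLE dw_sales (sale_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO stage_sales VALUES (?, ?)", [(1, 9.5), (2, 3.0), (3, None)])
conn.executemany("INSERT INTO dw_sales VALUES (?, ?)", [(1, 9.5), (2, 3.0), (3, None)])

def load_checks(conn):
    """Return simple data-quality metrics comparing stage and target tables:
    do the row counts agree, and how many loaded rows have NULL amounts?"""
    (stage_count,) = conn.execute("SELECT COUNT(*) FROM stage_sales").fetchone()
    (dw_count,) = conn.execute("SELECT COUNT(*) FROM dw_sales").fetchone()
    (null_amounts,) = conn.execute(
        "SELECT COUNT(*) FROM dw_sales WHERE amount IS NULL"
    ).fetchone()
    return {"counts_match": stage_count == dw_count, "null_amounts": null_amounts}

print(load_checks(conn))
```

In a production-support setting, checks like these run as post-load steps in the batch schedule, with a non-zero mismatch raising an incident.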

Environment: Hadoop, MapReduce, HDFS, Hive, Pig, HBase, Java, SQL, Cloudera Manager, Sqoop, Oozie, Elasticsearch, IBM DataStage v8.5/v7.5, Teradata SQL Assistant, MSTFS, PuTTY, Control-M scheduler, Teradata, UNIX Shell Scripting, Erwin.

Confidential

Developer

Responsibilities:

  • Identified and analyzed the jobs that needed to be migrated to the 8.5 server
  • Found alternative tables in Teradata for the affected jobs
  • Designed new Teradata queries for DataStage 8.5 with Teradata as the source, using the alternative tables
  • Ensured no change or impact to existing agency data consumer applications, a challenging constraint
  • Worked with the enterprise architect and client-side SMEs to define the integration layer and the Agency data model
  • Compared the alternative Teradata results with the existing system
  • Prepared approach and low-level design documents based on the new sources and approach
  • Designed and coded the ETL jobs for the new sources in order to generate the feed files
  • Involved in abort handling, restart and recovery processes, data clean-up, validation, performance monitoring of DataStage jobs, and clearing job logs
  • Held primary responsibility for quality reviews of code/design specs, acting as a gatekeeper before the code/design reaches the client
  • Defined the integration and batch layer production implementation plan and executed it without even a minor backout
  • Responsible for all data-issue recoveries in production
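Abort handling and restarts of DataStage jobs are commonly scripted around the `dsjob` command line. The sketch below is a hypothetical wrapper, not the project's actual recovery script: the project and job names are invented, and the command runner is injectable so the flow can be exercised without a DataStage installation.

```python
import subprocess

def restart_job(project, job, run=subprocess.run):
    """Reset an aborted DataStage job, then rerun it and wait for completion.

    `dsjob -run -mode RESET` clears the aborted state; the second `-run` with
    `-jobstatus` waits for the job and reflects its finishing status in the
    process exit code.
    """
    run(["dsjob", "-run", "-mode", "RESET", "-wait", project, job], check=True)
    result = run(["dsjob", "-run", "-jobstatus", project, job])
    return result.returncode == 0

# Usage with a stub runner (hypothetical project/job names), recording the
# commands instead of invoking a real DataStage engine:
calls = []
class _Result:
    returncode = 0
def stub(cmd, check=False):
    calls.append(cmd)
    return _Result()

ok = restart_job("dstage_proj", "daily_load", run=stub)
```

A real recovery script would also clear job logs and rerun any downstream cleanup, as described in the bullet above.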

Environment: IBM DataStage v8.5, Teradata SQL Assistant, MSTFS, PuTTY, Control-M scheduler, DB2 V9, Teradata, UNIX Shell Scripting, IBM DataStage v7.5. Team Size: 6

Confidential

Developer

Responsibilities:

  • Designed the architecture for automation of data migration and testing.
  • Designed the job flow according to project requirements.
  • Designed end-to-end automation, including development of generic ETL jobs and UNIX scripts.
  • Designed and implemented the Control-M job flow diagram.
  • Held prime responsibility for quality reviews of code/design specs, acting as a gatekeeper before the code/design reaches the client.
  • Carried out testing of jobs in parallel environments.
  • Carried out end-to-end testing of data migration with the automation tool.
  • Documented the data migration automation.
  • Designed a separate generic job for handling junk characters in case of any migration failure.
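A generic junk-character handler of the kind mentioned above often reduces to stripping bytes outside the printable ASCII range before load. The function below is an illustrative sketch of that idea, not the actual DataStage job logic.

```python
import string

# Printable ASCII plus common whitespace; anything else is treated as "junk".
_ALLOWED = set(string.printable)

def scrub_junk(value: str) -> str:
    """Drop characters outside the printable ASCII set, which is where
    cross-DBMS junk (control bytes, mis-encoded text) typically shows up
    in DB2-to-Teradata feeds."""
    return "".join(ch for ch in value if ch in _ALLOWED)

print(scrub_junk("ACME\x00 Corp\x07"))
```

Making the scrub generic (one job applied to any failed feed) avoids hand-fixing each migration failure, which matches the "separate generic job" design above.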

Environment: IBM DataStage v8.5, Teradata SQL Assistant, MSTFS, PuTTY, Control-M scheduler, DB2 V9, Teradata, UNIX Shell Scripting

Confidential

Developer

Responsibilities:

  • Ran the DataStage jobs to migrate data from DB2 to the Teradata holding area.
  • Performed testing (spot checks, row counts) on the data in both Teradata and DB2.
  • Analyzed the problems whenever DataStage jobs failed.
  • Analyzed the tables that had row-count mismatches between Teradata and DB2.
  • Performed analysis of missing tables in all subject areas.
  • Performed analysis of missing jobs for table generation using the Control-M flow.
  • Identified a junk-character issue between DB2 and Teradata and fixed it with a generic job.
  • Analyzed the DataStage jobs and gathered information about different subject areas like Stores, Finance, Target.com, and SHO.
  • Analyzed the UNIX shell scripts and gathered information about different subject areas like Stores, Finance, Target.com, and SHO.
  • Developed generic UNIX scripts for capturing information on all DataStage jobs along with their purpose.
  • Developed UNIX scripts for capturing the run logs for a given project over a period of time.
  • Developed a gold list of the tables found in our analysis across all subject areas.
  • Analyzed the UNIX shell scripts and gathered information about different subject areas like Inventory Mart, Inventory Relief, and FMA.
  • Monitored DataStage jobs scheduled in Control-M and reported failed jobs.
  • Responsible for creating the deliverable units to migrate objects into the system testing and functional verification testing environments.
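The row-count reconciliation between DB2 and the Teradata holding area can be sketched as below. This is an illustrative example with invented table names and counts; in practice the per-table counts would come from generic `SELECT COUNT(*)` scripts run against each system.

```python
def count_mismatches(db2_counts, td_counts):
    """Compare per-table row counts from DB2 (source) and Teradata (holding area).

    Returns (tables missing on either side, tables whose counts differ),
    both sorted for stable reporting.
    """
    missing = sorted(set(db2_counts) ^ set(td_counts))
    mismatched = sorted(
        t for t in set(db2_counts) & set(td_counts)
        if db2_counts[t] != td_counts[t]
    )
    return missing, mismatched

# Hypothetical counts gathered from both systems after a migration run.
db2 = {"STORES.SALES": 100, "FIN.GL": 50, "SHO.ITEMS": 7}
td = {"STORES.SALES": 100, "FIN.GL": 49}
print(count_mismatches(db2, td))
```

Tables flagged here would then feed the missing-table and mismatch analyses described in the bullets above.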

Environment: IBM DataStage v8.5, Teradata SQL Assistant, MSTFS, PuTTY, Control-M scheduler, DB2 V9, Oracle, Teradata, UNIX Shell Scripting
