
Sr. Software Engineer Resume

Pennsylvania

SUMMARY

  • 10+ years of experience in the Software Development Life Cycle, with key emphasis on Data Warehousing (ETL), Data Migration, Cloud Technologies, and Business Intelligence (reporting) at both offshore and onshore locations, and prominent experience in the Banking, Financial, Health Care, and Insurance domains.
  • Involved in various development, maintenance, and enhancement projects with CI/CD and DevOps processes and tools.
  • Certified in Information Technology Infrastructure Library (ITIL) at the foundation level and as an Agile Scrum Master.
  • Proven experience in data mapping. Built Source to Confidential Mapping (STM) documents for various projects involving multiple data sources from different databases, including DB2, Oracle, SQL Server, Mainframe DB, XML, and web services.
  • Strong experience in designing complex, reusable jobs to implement slowly changing dimensions using Shared Containers and Change Data Capture.
  • Experience in web service implementation through the ETL process.
  • Extensive experience in loading high-volume data and in performance tuning.
  • Involved in creating dashboards, data quality reports, and reconciliation reports using the Tableau reporting tool.
  • Implemented Scrum/Agile development methodologies and practices according to application requirements. Understanding of business processes and data models.
  • Hands-on experience with tracking and collaboration tools such as Confluence, Slack, JIRA, and Bugzilla, including configuring and administering them to meet project needs.
  • Proficient in various EDW environments across the Telecommunication, Healthcare, Finance, and Retail sectors.
  • Hands-on experience with project migration from a lower DataStage version (11.5) to a higher one (11.7).
  • Participated actively in Server job upgrades using the Connector Migration Tool.
  • Involved in Disaster Recovery plan preparation, execution, reporting, and enhancements.
  • Extensively worked on Jenkins CI/CD pipeline jobs for end-to-end automation to build, test, and deliver artifacts, and troubleshot build issues during the Jenkins build process.
  • Involved in AWS POCs: S3 bucket implementation in IBM InfoSphere 11.7.
  • Implemented continuous integration and deployment (CI/CD) through DevOps tooling (Bamboo, Jenkins, Puppet, Maven, Git, Bitbucket).
  • Loaded raw data into Spark DataFrames from AWS S3 (Simple Storage Service) and performed in-memory computation to generate output, using Spark libraries for transformations and actions with Python as the programming language.
  • Agile practice using Jira, Rally, and VersionOne for story/program tracking and Confluence for content management.
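The Spark-on-S3 bullet above describes a transformations-then-actions flow. As a minimal local sketch, the same pattern is shown with plain Python lists standing in for Spark DataFrames (all field names and values are hypothetical, not from the original project):

```python
# Hypothetical raw records, standing in for JSON pulled from an S3 bucket.
raw_rows = [
    {"account": "A1", "amount": "120.50", "status": "posted"},
    {"account": "A2", "amount": "bad",    "status": "posted"},
    {"account": "A1", "amount": "30.25",  "status": "pending"},
]

def parse_amount(row):
    """Cast the amount field, flagging rows that fail to parse."""
    try:
        return {**row, "amount": float(row["amount"]), "valid": True}
    except ValueError:
        return {**row, "amount": 0.0, "valid": False}

# "Transformations": map + filter, analogous to withColumn / filter on a DataFrame.
parsed = [parse_amount(r) for r in raw_rows]
posted = [r for r in parsed if r["valid"] and r["status"] == "posted"]

# "Action": aggregate per account, analogous to groupBy().sum().
totals = {}
for r in posted:
    totals[r["account"]] = totals.get(r["account"], 0.0) + r["amount"]

print(totals)  # {'A1': 120.5}
```

In the real pipeline the map/filter/aggregate steps would run lazily on Spark executors; the local version only illustrates the shape of the computation.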

TECHNICAL SKILLS

BI/ETL Tools: IBM InfoSphere DataStage and QualityStage 8.x/9.x/11.x, MicroStrategy, Tableau, Big Data concepts (Hive, Sqoop, HDFS).

Databases: Microsoft SQL Server, DB2, Netezza, Oracle, Snowflake.

Scheduling Tools: ActiveBatch, BMC Control-M & Tivoli Workload Scheduler.

Cloud Concepts: AWS (API Gateway, S3, Lambda, Glue ETL, CloudWatch, AWS DataSync, Amazon SNS).

Programming/IDEs: PyCharm, Jupyter Notebook, Node.js, Python.

Tools: Jenkins, Nexus, Bitbucket, Bamboo, Ansible, Confluence, GitHub, SVN, Jira, JD Edwards.

PROFESSIONAL EXPERIENCE

Confidential, Pennsylvania

Sr. Software Engineer

Responsibilities:

  • Developed and supported the daily and monthly DataStage jobs in the ITRC Dashboard.
  • Used the Hierarchical Data stage to extract ServiceNow JSON data from the Incident, Change, and Problem APIs.
  • Successfully migrated the DataStage application from version 11.5 to 11.7.
  • Used the Change Data Capture stage in DataStage to maintain historical datasets.
  • Handled large-volume data sets in the daily DataStage jobs for the Qualys Vulnerabilities dial.
  • Worked on the FTP-to-FTPS UNIX script conversion process.
  • Involved in Dell Boomi POCs on RDB ETL migration and future development of current legacy tools toward AWS modernization.
  • Published SQL views to the production Tableau server for the new dials in the ITRC Dashboard.
  • Used deployment tools (BladeLogic, Ansible, Nexus, Jenkins, Bitbucket, Bamboo) to deploy DataStage code into the production environment.
  • Participated in POC projects and Risk Council meetings, and performed other duties as assigned.
  • Supported other technology tools as needed (AngularJS) for ITRC web app maintenance.
  • Extensively worked in the NGA/NGSA process for deploying applications.
  • Worked on S3 bucket POCs in DataStage 11.7, reading/writing data to an S3 bucket using the Amazon S3 Connector stage.
  • Worked on Node.js to implement ITRC enhancements on the ITRC webpage.
  • As part of controls automation, worked on a Control Service POC (VGCP-00143) whose data sources are the Remedy tool (Ctrl-M), Archer, and ServiceNow.
  • Involved in Data Lake meetings for Risk Control Automation use cases.
  • Worked on continuous delivery pipelines (Bamboo, Jenkins, Git, Bitbucket) to implement changes in the AWS DEV/ENG regions.
  • Good knowledge of Amazon Web Services (AWS) cloud services such as Amazon S3, API Gateway, CloudWatch, Lambda, and Glue.
  • Worked on Data Lake pipelines with ServiceNow teams on AWS Secrets Manager.
  • Created Python scripts to extract ServiceNow API data and load it into the S3 raw bucket.

Environment: IBM InfoSphere DataStage 11.5 & 11.7, Microsoft SQL Server, Tableau, BMC Ctrl-M, UNIX, Ansible, Nexus, Jenkins, Bitbucket, Bamboo, Jira, PyCharm, Python (NumPy & Pandas), AWS (S3, Lambda, Glue, Amazon API Gateway).

Confidential, Cincinnati-Ohio

Sr. DataStage Developer

Responsibilities:

  • Acted as a release manager, working closely with the offshore team as an onsite coordinator.
  • Collaborated with business users and Subject Matter Experts (SMEs) to understand the business requirements.
  • Successfully migrated sub-projects in Investment Advisor from version 8.1 to 9.1 and from 9.1 to 11.5.
  • Executed proofs of concept to understand the impact on older Server jobs in version 9.1.
  • Analyzed the quality of the jobs developed by team members, provided suggestions to improve performance, and carried out performance tuning.
  • Used the Web Services stage in the E-Money sub-project; developed web service jobs that invoke WSDL operations to obtain services through a request/response process.
  • Experienced in using various Hadoop infrastructures such as MapReduce, Hive, and Sqoop.
  • Imported WSDL file definitions using the Import Web Service File Definitions utility in DataStage Designer.
  • Cleansed the extracts using QualityStage stages in DataStage.
  • Parsed JSON and XML files using the Hierarchical stage.
  • Received structured and unstructured source files (.XML, .DAT, .TXT, .XLS) from upstream and passed them downstream using FTP stages.
  • Used Change Data Capture stages for maintaining history data.
  • Used Aggregator, Sort, Merge, and Dataset stages in Parallel Extender to achieve better job performance.
  • Worked effectively and efficiently both in a team and individually, with professional conduct and excellent interpersonal and technical communication skills. Handled CRQ requests and managed calls on a weekly basis.
  • Implemented exception handling in the ETL process based on the business rules.
  • Extensively worked on automating the manual CIP process used in production activities.
  • Involved in daily stand-up meetings, onsite coordination meetings, and Business Analyst functional specification meetings on a day-to-day basis.
  • Worked on process documents for internal and external audits, and used the Data Rule stage to audit and validate source and ETL data for comprehensive and complete results.
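The Change Data Capture work above amounts to comparing incoming rows against the active dimension rows, expiring the old version on change and inserting a new one. A minimal Type 2 slowly-changing-dimension sketch in plain Python (the key, attribute, and dates are made up for illustration):

```python
from datetime import date

def scd2_merge(current, incoming, today):
    """Apply Type 2 SCD logic: expire the active row when an attribute
    changes, and insert a new active version; new keys are inserted."""
    out = list(current)
    active = {r["key"]: r for r in out if r["end_date"] is None}
    for row in incoming:
        old = active.get(row["key"])
        if old is None:                       # brand-new key: insert
            out.append({**row, "start_date": today, "end_date": None})
        elif old["attr"] != row["attr"]:      # changed: expire + insert
            old["end_date"] = today
            out.append({**row, "start_date": today, "end_date": None})
        # unchanged rows are left alone
    return out

dim = [{"key": "C1", "attr": "gold", "start_date": date(2020, 1, 1), "end_date": None}]
feed = [{"key": "C1", "attr": "platinum"}, {"key": "C2", "attr": "silver"}]
dim = scd2_merge(dim, feed, date(2021, 6, 1))
print(len(dim))  # 3 rows: expired C1, new C1, new C2
```

The DataStage CDC stage does the same comparison per key at scale; this sketch just shows the expire-and-insert decision table.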

Environment: IBM DataStage 9.1, MicroStrategy 9.1.3, Netezza, DB2, Tivoli Workload Scheduler, UNIX.

Confidential

Team Lead

Responsibilities:

  • Understood the requirements and documented expectations; handled the current process, modified and created jobs per the updated requirements, and handled the load process to the data marts.
  • Involved in discussions with the business team to scrutinize the requirements for extracting data from the source systems.
  • Migrated DataStage jobs from version 9.1 to 11.5.
  • Worked closely with the ETL/DB admin team to ensure that all database and ETL objects and components were correctly migrated for the production deployment.
  • Implemented quality checks using QualityStage stages.
  • Performed release management planning to ensure releases are scheduled in line with existing processes and business deadlines.
  • Worked on performance tuning at various levels of Confidential and source for large datasets.
  • Guided peers and reviewed their code.
  • Provided end-to-end support for UAT (User Acceptance Testing) and production deployment.
  • Tuned DataStage jobs for better performance by creating DataStage hash files for staging the data and lookups.
  • Performance-tuned SQL queries consisting of many tables with large amounts of data.
  • Designed a conceptual data model based on the requirements and interacted with non-technical end users to understand the business logic.
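The hash-file lookup tuning above comes down to staging reference rows in a keyed structure once, so each source row is matched in O(1) instead of rescanning the reference table. A sketch in plain Python (table and column names are hypothetical):

```python
# Reference data that would be staged into a hashed file in DataStage.
accounts = [
    {"acct_id": "A1", "branch": "NYC"},
    {"acct_id": "A2", "branch": "PHL"},
]
transactions = [
    {"acct_id": "A1", "amt": 10.0},
    {"acct_id": "A3", "amt": 5.0},   # no matching reference row
]

# Build the keyed stage once (the "hash file"), then probe it per row.
by_id = {a["acct_id"]: a for a in accounts}

enriched = [
    {**t, "branch": by_id.get(t["acct_id"], {}).get("branch", "UNKNOWN")}
    for t in transactions
]
print(enriched[1]["branch"])  # UNKNOWN
```

The default value plays the role of a lookup-failure rule (reject, drop, or continue) that a DataStage Lookup stage would let you configure.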

Environment: DataStage 9.1, Oracle, PostgreSQL, UNIX, Visio.

Confidential

Data Analyst

Responsibilities:

  • Developed DataStage parallel jobs, sequence jobs, and routines per the requirements.
  • Interacted with the client to understand the business requirements and developed the jobs accordingly.
  • Gathered business requirements and modified the jobs accordingly.
  • Standardized job parameters, job flows, and the audit process to meet the design standards.
  • Involved in post-Go-Live support activities.
  • Prepared design documents for the project Go-Live and delivered Knowledge Transfer to the production support team.
  • Expertise in various phases of the Big Data analytics life cycle: data collection, data ingestion, data processing, data storage, data query, and data visualization.
  • Performed root cause analysis and ETL code/data fixes for data issues raised by the business analysts. Participated in reviews of data modeling and business requirement analysis, and assisted with defining requirements.
  • Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
  • Used DataStage Director to schedule runs of the solution, test and debug its components, and monitor the resulting executable versions.
  • Involved in importing metadata from Oracle and DB2 databases. Used the CDC stage and CA to load Type 2 dimensional tables. Used DataStage Designer to develop DataStage jobs, scheduled the jobs through DataStage Director, and tuned DataStage jobs for better query performance.
  • Experienced in using various Hadoop infrastructures such as MapReduce, Hive, Sqoop, and Oozie.
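Partitioning parallelism, mentioned above, splits rows across workers by a key hash so each partition can be processed independently with roughly balanced load. A small sketch using Python's standard thread pool (the partition count, key, and per-partition transform are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

def partition(rows, key, n_parts):
    """Hash-partition rows by key, as a DataStage hash partitioner would:
    all rows sharing a key value land in the same partition."""
    parts = [[] for _ in range(n_parts)]
    for row in rows:
        parts[hash(row[key]) % n_parts].append(row)
    return parts

def process(part):
    # Stand-in for a per-partition transform, e.g. an aggregate.
    return sum(r["amt"] for r in part)

rows = [{"cust": f"C{i % 4}", "amt": 1.0} for i in range(100)]
parts = partition(rows, "cust", n_parts=4)

with ThreadPoolExecutor(max_workers=4) as pool:
    totals = list(pool.map(process, parts))

print(sum(totals))  # 100.0
```

Keeping same-key rows together is what makes per-partition aggregates correct without a final merge step; pipeline parallelism would additionally overlap the read, transform, and write phases.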

Environment: DataStage 9.1, MicroStrategy 9.1.3, Big Data, UNIX.

Confidential

ETL Consultant

Responsibilities:

  • Worked on the design, build, unit testing, and implementation of the NPS.
  • Worked on Change Requests raised by clients.
  • Created IBM DataStage parallel jobs for extracting, transforming, and loading data using the DataStage Designer tool in parallel processing mode.
  • Extensively used Parallel Extender to load data into the data warehouse with techniques such as pipeline and partition parallelism in an SMP environment.
  • Responsible for unit, system, and integration testing. Developed test scripts, test plans, and test data. Participated in UAT (User Acceptance Testing).
  • Involved in DataStage mapping, data profiling, and batch processing.
  • Performed data profiling based on the business requirements.
  • Gathered business requirements and worked on mapping specs.
  • Enhanced the existing jobs and tested them in the Quality Assurance environment.
  • Implemented multi-node declaration using a configuration file (APT config file) for performance enhancement.
  • Worked with the ETL Architect and technical teams on systems performance and other maintenance issues.
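The multi-node declaration above lives in a plain-text APT configuration file that lists the logical nodes the parallel engine can run on. A hedged two-node sample, assuming a single SMP host (the hostname and disk paths are illustrative, not from the original project):

```
{
  node "node1"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/node1" {pools ""}
    resource scratchdisk "/scratch/ds/node1" {pools ""}
  }
  node "node2"
  {
    fastname "etlhost"
    pools ""
    resource disk "/data/ds/node2" {pools ""}
    resource scratchdisk "/scratch/ds/node2" {pools ""}
  }
}
```

Pointing APT_CONFIG_FILE at a file with more nodes raises the degree of parallelism without changing the job design.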

Environment: DataStage 8.1, Teradata, UNIX, Oracle Database

Confidential

Data Analyst

Responsibilities:

  • Developed DataStage jobs according to the approved functional specifications per the client requirements.
  • Used the technical transformation document to design and build the extraction, transformation, and loading modules, and followed the various naming specifications.
  • Used the Data Rule stage to audit and validate source and ETL data for comprehensive and complete results.
  • Developed parallel jobs using stages such as Dataset, Sequential File, Aggregator, Merge, Transformer, Lookup, Join, Pivot Enterprise, and Peek, and databases such as Oracle.
  • Tested the jobs to ensure that data is loaded according to the functional specification per the client requirements.
  • Involved in unit testing and preparing the unit test cases.
  • Ran and monitored the jobs using DataStage Director and checked logs.
  • Tuned jobs for better performance.
  • Understood the testing effort by analyzing the project requirements.
  • Documented, implemented, monitored, and enforced all testing processes per the standards defined by the organization. Provided implementation services for Software Configuration Management (SCM) using IBM Rational ClearQuest and ClearCase.
  • Implemented and reviewed the unit test case documents.
  • Worked on the code release documents and the code backup process on a daily basis.
  • Handled the code check-in/check-out process by running Ctrl-M scripts.
  • Organized status meetings and sent status reports (daily, weekly, etc.) to the client.
  • Scheduled batch jobs in AutoSys and coordinated with system operators to schedule them. Provided Level 3 production support.
  • Used version control for DataStage to track changes made to the DataStage project components and to protect jobs by making them read-only.
  • Designed documents for performing unit testing and string testing of the developed code.
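The audit and unit-testing bullets above typically reduce to reconciling row counts and control totals between source and target after a load. A hypothetical reconciliation check in plain Python (field names and figures are made up):

```python
def reconcile(source_rows, target_rows, amount_field="amt"):
    """Compare row counts and control totals between source and target,
    returning a small report dict an audit log could consume."""
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_sum = round(sum(r[amount_field] for r in source_rows), 2)
    tgt_sum = round(sum(r[amount_field] for r in target_rows), 2)
    return {
        "count_match": src_count == tgt_count,
        "sum_match": src_sum == tgt_sum,
        "src": (src_count, src_sum),
        "tgt": (tgt_count, tgt_sum),
    }

source = [{"amt": 10.5}, {"amt": 4.5}]
target = [{"amt": 10.5}, {"amt": 4.5}]
report = reconcile(source, target)
print(report["count_match"], report["sum_match"])  # True True
```

A unit test case would assert both flags; a mismatch points at dropped, duplicated, or truncated rows in the load.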

Environment: DataStage 8.5, SQL Server 2008, AutoSys, ClearQuest.

Confidential

Data migration Analyst

Responsibilities:

  • Worked on the job design, build, unit testing, and implementation of the Confidential migration.
  • Worked on the DataStage job schedulers.
  • Created test data and developed test scripts for the project.
  • Tested whether the UNIX Confidential files were populated correctly per client requirements.
  • Worked on Change Requests raised by clients.
  • Created jobs to extract data from different sources.
  • Created jobs using various stages in DataStage Designer.
  • Analyzed the existing source and Confidential systems.
  • Analyzed the performance of the jobs and enhanced it.
  • Designed jobs using stages such as Aggregator, Transformer, Lookup, Filter, Remove Duplicates, Copy, Dataset, and Sequential File.
  • Extracted data from sources such as Oracle, flat files, and HDFS files.
  • Involved in low-level design; developed various jobs and performed data loads and transformations using different DataStage stages and pre-built routines and functions.
  • Developed DataStage jobs using a star schema and followed development standards across different projects.
  • Verified functional specifications and reviewed deliverables.

Environment: DataStage 8.1, SQL Server 2008, Oracle 9i, PL/SQL, Erwin 3.5.2, Shell Scripting, UNIX

Confidential

Test Analyst

Responsibilities:

  • Communicated with the onsite team to get functionality updates and daily status reports.
  • Experience working with the MicroStrategy security model, including users, groups, security roles, and security filters.
  • Performed quality assurance through unit testing of all reports.
  • Involved in report tuning, caching, specifying query governing thresholds, and report optimization.
  • Provided production support for MicroStrategy report users.
  • Performed detailed data analysis and validation and identified data quality issues to ensure consistency across all environments.
  • Created highly interactive MicroStrategy dashboards using Visual Insights that replaced legacy reports.
  • Created schemas, metrics, reports, and documents supporting client business goals.
  • Improved performance with Intelligent Cubes.
  • Delivered visualizations and widgets including Bubble Grids, ESRI Maps, Heat Maps, Time Series, Waterfall graphs, Microcharts, and Scatterplots.
  • Developed SQL queries to meet reporting requirements.

Environment: MicroStrategy Web 8.1.1, DB2, HP Quality Center

Confidential, Texas

Software Engineer

Responsibilities:

  • Worked as an onsite developer coordinating with the offshore team to implement the rollout process in the production environment.
  • Implemented a pilot project in an individual restaurant as part of a technology upgrade.
  • Handled issues and got them resolved in time, based on severity, so that the client's business was not affected.
  • Developed parallel jobs using stages such as Dataset, Sequential File, Aggregator, Merge, Transformer, Lookup, and Join.
  • Ran and monitored the jobs using DataStage Director and checked logs in the ActiveBatch scheduler.
  • Involved in testing the developed code, fixing any defects, and retesting before delivery to the client manager.

Environment: DataStage 8.0, MicroStrategy, SQL Server, Oracle, WIN JDE Systems

Confidential

Junior Developer

Responsibilities:

  • Involved in preparing technical specifications for WIN JDE.
  • Utilized JD Edwards to create, test, and deploy a variety of reports, interactive applications, and batch applications.
  • Set up user roles, Allowed Actions, and Transfer Activity Rules within Object Management Workbench.
  • Performed package builds and deployments for full and update packages.
  • Actively involved in MicroStrategy report scheduling for monthly metrics reports.

Environment: JD Edwards, Oracle DB.
