
Senior Application Developer Resume


Boston, MA

SUMMARY

  • 10 years of software development experience in analysis, design, development, and production/maintenance of data warehousing applications.
  • Thorough knowledge of the data mart development life cycle.
  • Created and configured management reports and dashboards. Managed and maintained use cases in correlation systems. Developed, evaluated, and documented specific metrics for stakeholders, groups, and management. Resolved configuration-based issues in coordination with infrastructure support teams.
  • Performed all dimensions of development, including extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Server Manager, Workflow Manager, Workflow Monitor).
  • Designed multi-dimensional models (star and snowflake schemas). Experience in data modelling, business process analysis, and reengineering.
  • Involved in development of complex mappings using transformations such as Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Union, and Update Strategy.
  • Experience in data extraction from multiple operational sources and loading of staging areas, data warehouses, and data marts using CDC/SCD (Type 1/Type 2) loads.
  • Extensive experience in Oracle 10g/11g/12c and scripting skills in multiple languages including Perl, AWK, sed, and basic shell scripting. Automated various jobs and performance tasks by creating scripts and cron jobs. Worked on Oracle PL/SQL programming, including development and tuning of applications, packages, stored procedures, and functions, and on Unix/Linux shell scripting.
  • Experience with Oracle Real Application Clusters (RAC) and support of mission-critical, highly available, and highly scalable database systems. Experienced in working with data warehouse installations and knowledgeable in managing dimensional data, Oracle SQL tuning, and storage management.
  • Proficient with the most recent versions of Cognos and associated applications used in generating reports for different organizational requirements.
  • Worked on ETL processes using Informatica 9.0 and IBM InfoSphere DataStage 8.5/11.5.
  • Installed, tested and deployed monitoring solutions with Splunk services. Implemented forwarder configuration, search heads and indexing.
  • Experience in data analysis, RDBMS, OLTP, OLAP, and MS SQL Server, writing T-SQL, SSIS, and SSRS.
  • Prepared, arranged, and tested Splunk search strings, including complex operational searches, and provided technical services for projects, user requests, and data queries.
  • Expertise in data warehousing concepts such as fact tables, dimension tables, logical data modeling, and physical modeling; experienced in data modeling projects using star schemas and the Erwin tool.
  • Developed specifications, source-to-target mappings, and other documentation required for development. Implemented adequate monitoring, validations, data integrity, data modelling, and auditing to accurately measure and trend ETL performance. Fine-tuned complex queries against very large databases (VLDBs).
  • Experience in performance tuning of sources, targets, mappings, transformations, and sessions by implementing techniques such as partitioning and pushdown optimization, as well as identifying performance bottlenecks.
  • Exposure to the finance and banking sectors with regard to data warehouse development.
  • Superior communication, decision-making, and organizational skills; customer-service oriented. Excellent analytical, problem-solving, client-relations, and negotiation skills.
  • Experience in relational database management systems such as Microsoft SQL Server 2008, T-SQL, SSIS (SQL Server Integration Services for data integration and workflow applications), MySQL, Sybase, DataStage, and DB2, and in multidimensional database concepts such as Oracle Hyperion Essbase, OLAP, and OLTP tools.
  • Reviewed and fine-tuned scripts, stored procedures, functions, and triggers related to bug fixes before release to production, and developed views, synonyms, functions, procedures, packages, triggers, and cursors.
  • Highly motivated, target-oriented team player with a can-do mindset who enjoys working with multi-functional teams toward a common goal, as well as contributing individually.
  • Enthusiastic and quick to adopt new technologies that enhance work quality. Proficient in documenting all day-to-day administration procedures, change controls, and support activities.
  • Experienced in handling multiple projects simultaneously and effectively.
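The scripted job automation mentioned above (shell scripts plus cron) can be illustrated with a small wrapper; the job names, paths, and crontab entry below are hypothetical, not taken from the resume:

```shell
#!/bin/sh
# Hypothetical wrapper for recurring warehouse jobs scheduled via cron:
# run a command, timestamp its output into a dated log, and preserve
# the command's exit status so the scheduler can detect failures.

LOG_DIR="${LOG_DIR:-/tmp/job_logs}"

run_with_log() {
    job_name="$1"; shift
    mkdir -p "$LOG_DIR"
    log_file="$LOG_DIR/${job_name}_$(date +%Y%m%d).log"
    {
        echo "[$(date '+%Y-%m-%d %H:%M:%S')] START $job_name"
        "$@"
        status=$?
        echo "[$(date '+%Y-%m-%d %H:%M:%S')] END $job_name status=$status"
    } >> "$log_file" 2>&1
    return $status
}

# Example crontab entry (illustrative paths): run the nightly load at 02:00.
# 0 2 * * * LOG_DIR=/var/log/etl /opt/etl/bin/nightly_load.sh
```

Preserving the wrapped command's exit status matters because schedulers such as cron or AutoSys key their alerting off the script's return code.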

TECHNICAL SKILLS

Methodologies: Agile and Waterfall

Operating Systems: Windows Server, Solaris, Red-Hat Linux, Cygwin

Cloud-Based Support: VMware, ESX, Hyper-V, AWS (EC2, S3, RDS, EBS, VPC, IAM).

Languages: PL/SQL, SQL, RMAN, Shell, Perl, C, Java, JCL, HTML, COBOL

Data warehousing Tools: Informatica PowerCenter 9.x/8.x/7.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Repository Admin Console, Erwin Data Modeler.

Backup/Recovery: RMAN, user-managed backup/recovery.

High Availability: OPS, RAC, CRS, Grid Infrastructure, Data Guard, Replication, Automatic Storage Management (ASM)

Tuning: Operating system, network, database parameters, SQL and PL/SQL.

Tools: Oracle Enterprise Manager (EM), DBVerify, LogMiner, Tkprof, SQL*Loader, import/export, Data Pump, Automatic DB Diagnostic Monitor, OBIEE 10.1.3.x, IBM Tivoli Monitoring Suite, ITCAM, ITNM, Solarwinds, SCOM, Splunk 6.0.

IT Software: Oracle, SQL Server, PL/SQL Developer, TOAD, PuTTY, WS_FTP Pro, log analyzers, diff utilities, Notepad++, EditPlus, MS Word, Excel, PowerPoint, Oracle GoldenGate, Outlook, SharePoint, Erwin, Oracle Designer, ClearQuest.

PROFESSIONAL EXPERIENCE

Confidential, Boston, MA

Senior Application Developer

Environment: ETL, PL/SQL, SQL, Shell/Perl scripting, Oracle 10g/11g/12c, AutoSys, PuTTY, Excel, SQL*Loader, IBM InfoSphere DataStage 8.5/11.5, PL/SQL Developer, SQL Navigator, WS_FTP Pro, WinSCP, SharePoint, ClearQuest.

Responsibilities:

  • Analyzed ETL mapping document requirements for the development of DataStage extraction, transformation, and loading (ETL) jobs to load data from the RDR, CMS, MAN, and TDR source systems into target Risk Authority (RAY) schema tables and enable the data elements required for RAY calculations.
  • Performance tuning of the mappings and data load.
  • Designed and developed Job Information Language (JIL) scripts for automated scheduling of ETL jobs using the AutoSys scheduler. Handled full and partial loads via staging tables in the ETL layer for daily RWA and monthly Advanced and Standard contexts specific to particular reporting dates.
  • Analyzed business reporting requirements and translated them into analytical applications for the Oracle warehouse. Modified existing loading scripts per change requests. Coded and programmed ETL mappings for the Oracle data warehouse according to client specifications.
  • Attended daily agile scrum calls to update the status of user requirements being worked on, and participated in iteration planning meetings to gather user requirements, define project scope, and perform feasibility analysis.
  • Wrote shell scripts and Oracle PL/SQL procedures to load data into staging tables, and developed architecture options to address scalability and calculation performance, such as load balancing by creating multiple task servers on the Windows application server.
  • Used Perl scripts for data cleaning and implemented features including materialized views for better performance of summary tables, autonomous transactions, and dynamic SQL statements.
  • Used data modelling tools such as Erwin, ER diagrams, and normalization techniques to design databases with relational technology.
  • Worked on Basel III ADV and STD calculation configurations to deliver business-critical RAY application reports on time for all Pillar 3, Schedule 101, and HC-R/RC-R reports, responding to requirements quickly and updating the configuration according to requirements in the RAY (Risk Authority) application.
  • Used ClearCase to version-control the ETL jobs, and deployed configuration scripts through cloud DB deploy and Build Forge processes under tight deadlines while driving quality practices.
  • Contributed to process improvements to reduce handling time and improve output.
  • Performed performance monitoring of database SQL and designs causing abnormal CPU and memory consumption, using AWR reports and Oracle internal tables.
  • Experience in troubleshooting using various log files.
  • Experienced in performance tuning of sources, targets, mappings, transformations, and sessions by implementing techniques such as partitioning and pushdown optimization, as well as identifying performance bottlenecks.
  • Worked on data warehousing concepts such as fact tables, dimension tables, logical data modelling, and physical modelling.
  • Upgraded the ETL servers from v8.5 to v11.5, along with migration of code from DEV to QA, UAT, and PROD environments.
  • Involved in technical discussions and facilitated in meeting deadlines.
  • Provided the directions to team to implement unit and regression test plans for Extraction/Transformation.
  • Involved in Configuration management decisions.
  • Extended and maintained existing ETL processes to accommodate new or changed data feeds. Proactively diagnosed, researched, monitored, and resolved database-related faults and performance problems, including high CPU, memory, blocking, deadlocking, and timeouts.
  • Involved in complete SDLC process, through all phases of project life cycle - analysis, design, development, testing, implementation and post-production activities.
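The shell-plus-PL/SQL staging loads described above can be sketched as a minimal driver that runs the load through sqlplus and fails the job on any Oracle error; the connection variables and the `stg_pkg.load_staging` procedure are hypothetical, not names from the resume:

```shell
#!/bin/sh
# Hypothetical staging-load driver: invoke a PL/SQL load procedure via
# sqlplus and fail the job if any ORA- error code appears in the output.

has_ora_error() {
    # True (exit 0) if the given log file contains an Oracle error code.
    grep -Eq 'ORA-[0-9]{5}' "$1"
}

run_staging_load() {
    out="$1"   # file that captures the sqlplus output
    # Guard: only attempt the call where an Oracle client is installed.
    if ! command -v sqlplus >/dev/null 2>&1; then
        echo "sqlplus not available; skipping load" > "$out"
        return 0
    fi
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" > "$out" 2>&1 <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  stg_pkg.load_staging;   -- hypothetical load procedure
END;
/
EOF
    if has_ora_error "$out"; then
        echo "staging load failed, see $out" >&2
        return 1
    fi
}
```

Scanning the spool for `ORA-` codes catches errors that sqlplus itself does not always surface through its exit status.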

Confidential, Charlotte, North Carolina

IT Analyst

Environment: ETL, PL/SQL, SQL, Shell/Perl scripting, Oracle 10g/11g, PuTTY, Excel, SQL*Loader, Microsoft SQL Server Management Studio 2012, T-SQL, SSIS, SSRS, WS_FTP Pro, WinSCP, SharePoint, OBIEE 10.1.3.2, IBM Tivoli Monitoring, ITCAM for Applications, ITNM

Responsibilities:

  • Created SSIS packages to clean and load data to the data warehouse, and packages to transfer data between OLTP and OLAP databases.
  • Created SSIS packages using Pivot Transformation, Lookup, Derived Column, Conditional Split, Term Extraction, Aggregate, Execute SQL Task, Data Flow Task, Execute Package Task, etc. to generate underlying data for the reports and to export cleaned data from Excel spreadsheets, text files, MS Access, and CSV files to the data warehouse.
  • Monitored all systems/processes; escalated and assumed ownership of issues as needed to ensure resolution in a timely manner. Designed and implemented application enhancements, customized monitoring solutions, and performed troubleshooting.
  • Delivered business-critical application reports on time and responded to requirements quickly. Received volumes and rates from business partners, calculated costs, and made them available as Verification Cross-Check Reports.
  • Installed and configured performance monitoring utilities such as OEM Grid Control; configured ASM; created databases, instances, and services.
  • Involved in Creating and Administering the Physical Layer, Business Model & Mapping Layer and Presentation Layer using Oracle Business Intelligence Admin tool.
  • Supported Oracle RAC servers and troubleshot operational tasks including data loads, index builds, and backups.
  • Provided technical insight for support teams in the Tivoli Monitoring environment; troubleshot and resolved Tivoli Monitoring issues; participated in implementation and documentation of all monitoring activities.
  • Debugged scripts and jobs written in UNIX shell and Perl on the development and UAT servers.
  • Developed different kinds of Reports (pivots, charts, tabular) using global and local Filters.
  • Analyzed business requirements and system specifications to understand the application processing.
  • Responsible for creating and modifying the PL/SQL Procedures, Functions according to the business requirement.
  • Implemented robust ETL processes to extract, transform, and load data from transactional databases to the data marts and operational data stores.
  • Generated SQL and PL/SQL scripts to create and drop database objects, including tables, views, primary keys, indexes, constraints, packages, sequences, and synonyms.
  • Developed ETL specifications, source-to-target mappings, and other documentation required for ETL development. Implemented adequate ETL monitoring, validations, data integrity, data modelling, and auditing to accurately measure and trend ETL performance.
  • Monitored periodic ETL batch application processes and resolved errors on time.
  • Updated company and cost centre information so that changes are reflected across applications. Performed application maintenance and scheduled job monitoring.
  • Integrated real-time data from numerous sources using Oracle GoldenGate, enabling real-time business intelligence to deliver more targeted, personalized marketing initiatives and campaigns to end customers.
  • Involved in extensive code walkthroughs and organized the client's new requirements and tasks.
  • Documented daily tasks and updated the SharePoint access repository with new types of issues and resolutions. Prepared daily reports and attended client calls.
  • Updated operational processing times, production incident issues, and scenario process tracking in SharePoint based on each application's processing.
  • Updated information related to databases, reports, and business applications in SharePoint to help team members locate the information needed to make good decisions.
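The batch and systems monitoring described above can be sketched as a small log-scanning script of the kind often wired into a scheduler alert; the log format (ERROR/FATAL markers) is an assumption, not a detail from the resume:

```shell
#!/bin/sh
# Hypothetical batch-monitoring sketch: scan an ETL job log, count lines
# flagged ERROR or FATAL, and print a summary so an operator or an
# automated alert can act on failures quickly.

count_errors() {
    # Count lines flagged ERROR or FATAL in a log file.
    grep -cE '(ERROR|FATAL)' "$1"
}

summarize_log() {
    log="$1"
    errors=$(count_errors "$log")
    if [ "$errors" -gt 0 ]; then
        echo "ALERT: $errors error line(s) in $log"
        # Show the first few offending lines for quick triage.
        grep -E '(ERROR|FATAL)' "$log" | head -5
    else
        echo "OK: no errors in $log"
    fi
}
```

The same summary text can be piped to a ticketing or paging hook, which keeps the detection logic separate from the notification channel.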

Confidential

Informatica and Cognos Report developer

Environment: Oracle 9i/10g/11g, TOAD, Informatica 8.6, Cognos Report Studio, Analysis Studio, Metric Studio, Oracle Report Builder, SQL*Loader, Oracle SQL*Plus, Report Builder (6i & 9i), EditPlus, Excel, OBIEE 10.1.3.2, Oracle BI Applications 7.9.4 (Sales Analytics, Order Management Analytics)

Responsibilities:

  • Worked with Informatica 8.6 to perform ETL from various sources to the central data warehouse.
  • Involved in Design and Development.
  • Involved in testing of the various mappings involved in the ETL process.
  • Extensively used ETL to load data from flat files and delimited files, as well as from the relational database (Oracle 10g).
  • Developed and tested all the Informatica mappings.
  • Performance tuning of the mappings and data load.
  • Scheduling the workflows using Informatica scheduler.
  • Involved in Unit testing.
  • Worked with Lookup transformations and update strategies.
  • Prepared and updated mapping documents.
  • Developed reports using Oracle Report Builder and shell scripts for automating report delivery.
  • Performed database tuning using Explain Plan, hints, utlbstat/utlestat, Toad, PowerExplain, and Enterprise Manager.
  • Created complex PL/SQL packages, procedures, functions, and triggers, and extensively used PL/SQL tables, cursors, user-defined object types, and exception handling.
  • Provided on-call production database support.
  • Developed scripts to monitor all failover jobs and analysed logs for better performance.
  • Installed and configured Cognos 8 BI products, including tuning and testing.
  • Designed, developed and modified Cognos reports using Cognos Report Studio and Query Studio.
  • Created and maintained PL/SQL programs and shell scripts. Wrote ad-hoc queries and change request (CR) queries based on schema knowledge for various reporting requirements.
  • Automated various jobs and performance tasks by creating scripts and cron jobs.
  • Performance-tuned many queries and took backups of cron jobs and shell scripts.
  • Debugged and optimized SQL queries and stored procedures.
  • Guided team members on solution optimization.
  • Assisted in project planning and scheduling.
  • Developed and supported interactive tool report generation for clients.
  • End-to-End Automation of Metasolv, E-POS & CRM, Cognos BI module Reports.
  • Developed shell scripts for backup/recovery and management of the data warehouse.
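The script and crontab backups mentioned above can be sketched as a dated-archive routine; all paths are illustrative, and `crontab -l` availability is an assumption about the host:

```shell
#!/bin/sh
# Hypothetical backup sketch: snapshot the current crontab alongside the
# shell scripts, then archive everything into a dated tarball so jobs
# can be restored after a host failure.

backup_scripts() {
    src_dir="$1"            # directory holding the shell scripts
    dest_dir="$2"           # backup destination
    mkdir -p "$dest_dir"
    stamp=$(date +%Y%m%d)
    # Capture the crontab next to the scripts; ignore hosts without one.
    crontab -l > "$src_dir/crontab_$stamp.bak" 2>/dev/null || true
    tar -czf "$dest_dir/scripts_$stamp.tar.gz" -C "$src_dir" .
    echo "$dest_dir/scripts_$stamp.tar.gz"
}
```

Printing the archive path lets a calling job log or verify the artifact it just produced.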

Confidential

BI consultant

Environment: Oracle 9i/10g/11g, TOAD, Cognos Report Studio, Analysis Studio, Metric Studio, Oracle Report Builder, SQL*Loader, Oracle SQL*Plus, Report Builder (6i & 9i), EditPlus, Excel.

Responsibilities:

  • Developed reports using Cognos Report Studio and shell scripts for automating report delivery.
  • Created and maintained PL/SQL programs and shell scripts.
  • Wrote ad-hoc queries and change request (CR) queries based on schema knowledge for various reporting requirements.
  • Automated various jobs and performance tasks by creating scripts and cron jobs.
  • Performance-tuned many queries and took backups of cron jobs and shell scripts.
  • Involved in DBA activities such as troubleshooting Oracle Reports issues and tuning report parameters.
  • Coordinated improvements to system performance by increasing system specifications.
  • Debugged and optimized SQL queries and stored procedures.
  • Guided team members on solution optimization.
  • Assisted in project planning and scheduling.
  • Developed interactive tool report generation for clients.
  • End-to-End Automation of Metasolv, E-POS & CRM module Reports.

Confidential

BI developer

Environment: Oracle 9i, TOAD, Cognos Report Studio, Analysis Studio, Metric Studio, SQL*Loader, Report Builder (6i & 9i), EditPlus, Excel.

Responsibilities:

  • Implemented data extraction and comparison of data from different systems such as Metasolv, EPOS, HLR, BP, and CRM using Cognos BI, finding discrepancies by comparing data across systems with Metasolv as the base.
  • Performed root cause analysis for various data discrepancies, fixed the issues in the database, and cleared the discrepant data. The project involved the two workflow reconciliations below:
  • 1. EPOS-MS-BP-CRM Reconciliation
  • 2. MS-EPOS Reconciliation
  • Developed PL/SQL stored procedures and functions to extract data from the different systems, as well as code to fix the root causes of data discrepancies.
  • Automated many PL/SQL programs using shell scripts so that Excel sheets of discrepant data are emailed as attachments to the respective mail IDs.
  • Attended meetings and client calls with multi-functional teams.
  • Maintained the dashboard for current status and the repository for issues and incidents, and documented day-to-day tasks.
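The report automation above (exporting discrepant rows and mailing them out) might look like the following sketch; the pipe-delimited spool format, the recipient, and the `mailx -a` attachment flag are assumptions that vary by platform:

```shell
#!/bin/sh
# Hypothetical sketch of the discrepancy-report automation: convert a
# pipe-delimited spool (e.g. from sqlplus) to CSV, then mail it as an
# attachment where a suitable mailx is available.

to_csv() {
    # Convert pipe-delimited rows to comma-separated, quoting each field.
    awk -F'|' '{
        out = ""
        for (i = 1; i <= NF; i++) out = out (i > 1 ? "," : "") "\"" $i "\""
        print out
    }' "$1"
}

send_report() {
    csv="$1"; recipient="$2"
    if command -v mailx >/dev/null 2>&1; then
        # -a attaches the file on many mailx builds; flags vary by host.
        mailx -s "Daily discrepancy report" -a "$csv" "$recipient" < /dev/null
    else
        echo "mailx not available; report left at $csv"
    fi
}
```

Quoting every field keeps values containing commas or spaces intact when the CSV is opened in Excel.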

Confidential

PL/SQL developer

Environment: Oracle 11g, PL/SQL, Java, TOAD, SQL Developer, SQL Loader, UNIX Scripting

Responsibilities:

  • Analyzed the project scope, business rules, and user requirements, and performed feasibility analysis
  • Designed and developed new summary tables based on the reporting requirements
  • Designed and developed SQL*Loader jobs to move data from source systems to the application database
  • Designed and developed scripts to perform data validations and data integration checks
  • Prepared SQL queries and PL/SQL code to pull data for analysis and reporting
  • Prepared SQL queries to pull data for presentation on the web
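A SQL*Loader job of the kind described above could be driven by a script like this sketch; the table, columns, and file names are illustrative only, and the sqlldr call is guarded for hosts without an Oracle client:

```shell
#!/bin/sh
# Hypothetical SQL*Loader driver: generate a control file for a delimited
# feed, then invoke sqlldr against it.

write_ctl() {
    ctl="$1"
    cat > "$ctl" <<'EOF'
LOAD DATA
INFILE 'daily_feed.dat'
APPEND INTO TABLE stg_daily_feed
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(feed_date DATE "YYYY-MM-DD", account_id, amount)
EOF
}

run_loader() {
    ctl="$1"
    if command -v sqlldr >/dev/null 2>&1; then
        sqlldr "$DB_USER/$DB_PASS@$DB_TNS" control="$ctl" log="${ctl%.ctl}.log"
    else
        echo "sqlldr not available; control file written to $ctl"
    fi
}
```

Generating the control file from the script keeps the feed layout versioned alongside the job that loads it.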

Confidential

Team Member

Environment: Mainframes, DB2, COBOL, JCL, Oracle, Java, .NET

Responsibilities:

  • Attended sessions on various software technologies (programming languages, operating systems, software engineering, TCS proprietary tools, systems engineering, requirements engineering, testing and debugging).
  • Performed case studies and assignments on various software technologies.
  • Attended soft-skills sessions.
