
Business Analyst Resume


  • 14+ years of work experience in database development, implementation, customization, and data migration (ETL processes loading millions of records using PL/SQL procedures with optimized techniques).
  • Excellent working experience with OLTP databases, data warehouses, and data marts (OLAP).
  • Worked as an Oracle SQL/PL-SQL Developer, Data Analyst, Business Analyst, Database Administrator, and ETL Developer.
  • 2+ years of experience as a Database Administrator.
  • 3+ years of experience as an Informatica Developer.
  • Functional experience includes Banking/Financial, Telecommunications, Inventory Management Systems, Operational Support Systems (OSS), Asset Management, and Clinical Trials.
  • Experience in integrating different database systems, Master Data Management, controlling data flow, identifying performance bottlenecks, and solving critical performance issues.
  • Strong skills in PostgreSQL (9.x), MS SQL Server (2008 R2, 2012), and MySQL (5.x).
  • Excellent work experience with stored procedures, functions, packages, triggers, materialized views, database objects, collections, records, ref cursors, dynamic SQL, etc.
  • Excellent work experience in Oracle installations (Windows, UNIX), applying patch sets, hot/cold backup and recovery, RMAN backups, Flashback recovery, logical backup (IMP/EXP), and logical backup using Data Pump (IMPDP/EXPDP).
  • Extensive experience in handling large objects (LOBs).
  • Extensive experience developing ETL programs for data extraction, transformation, and loading using Informatica PowerCenter.
  • Created UNIX shell scripts to run Informatica workflows and control the ETL flow.
  • Experience in writing complex SQL queries; solid understanding of OLTP and OLAP systems.
  • Experience in writing analytical queries using RANK, DENSE_RANK, ROW_NUMBER, GROUP BY ROLLUP, GROUP BY CUBE, and CONNECT BY PRIOR.
  • Created views and materialized views with FORCE and FAST refresh options, materialized view logs, and scheduled materialized view refreshes.
  • Experience in resolving mutating-table errors when using triggers.
  • Experience with Oracle-supplied packages such as DBMS_STATS, DBMS_SQL, and UTL_FILE.
  • Experience with bulk collections, bulk inserts, bulk binding, SAVE EXCEPTIONS, and pipelined functions.
  • Experience in performance tuning: nested loop, hash, and sort-merge joins; B-tree, bitmap, and function-based indexes and their usage; hints to improve query performance; and parallel query execution.
  • Processed XML data from web services/middleware into Oracle tables and from Oracle tables back to web services.
  • Experience with SQL*Loader, complex data loading from flat files/XML into Oracle, and external tables.
  • Experience in complex report writing and automated report generation in UNIX environments.
  • Excellent experience in data modeling, performance tuning, SQL query optimization, execution plans, the TKPROF utility, gathering table statistics, and AWR reports.
  • Experience in table partitioning (LIST, RANGE, HASH, and composite), exchange partitions, creating and maintaining tablespaces, and table compression.
  • Experience in data migration from flat files into an Oracle schema using SQL*Loader, then from those tables into Cramer tables per the data/column format specified by Cramer (ETL process using PL/SQL stored procedures, functions, and packages).
  • Involved in gathering requirements from clients and leading teams.
  • Created unit test cases and performed code reviews. Knowledge of HP Quality Center.
  • Good interpersonal, analytical, and communication skills.
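
The bulk-processing pattern mentioned above (BULK COLLECT, FORALL, SAVE EXCEPTIONS) can be sketched roughly as follows; the table names `src_txn` and `tgt_txn` are hypothetical placeholders, not tables from any project described here:

```sql
-- Minimal sketch: batched load with BULK COLLECT / FORALL ... SAVE EXCEPTIONS.
-- src_txn and tgt_txn are hypothetical staging and target tables.
DECLARE
  CURSOR c_src IS SELECT * FROM src_txn;
  TYPE t_rows IS TABLE OF src_txn%ROWTYPE;
  l_rows   t_rows;
  bulk_err EXCEPTION;
  PRAGMA EXCEPTION_INIT(bulk_err, -24381);   -- ORA-24381: errors in array DML
BEGIN
  OPEN c_src;
  LOOP
    FETCH c_src BULK COLLECT INTO l_rows LIMIT 10000;  -- cap PGA use per batch
    EXIT WHEN l_rows.COUNT = 0;
    BEGIN
      FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS
        INSERT INTO tgt_txn VALUES l_rows(i);
    EXCEPTION
      WHEN bulk_err THEN
        -- log each failed row and keep going with the next batch
        FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
          DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                               ': ' || SQLERRM(-SQL%BULK_EXCEPTIONS(j).ERROR_CODE));
        END LOOP;
    END;
    COMMIT;
  END LOOP;
  CLOSE c_src;
END;
/
```

The LIMIT clause bounds memory per fetch, and SAVE EXCEPTIONS lets the array insert continue past bad rows, which is the usual trade-off when loading millions of records.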


Languages: C, C++, Core Java

RDBMS/Databases: Oracle 8, 9i, 10g, 11g, 12c; PostgreSQL; MS SQL Server (2008 R2, 2012); MySQL (5.x)

Database Programming: PL/SQL, T-SQL

Structured Query Language: ANSI SQL and the Oracle, PostgreSQL, MS SQL Server, and MySQL dialects

Database GUI tools: TOAD, PL/SQL Developer, SQL Server Management Studio, pgAdmin, MySQL Workbench, Navicat for MySQL.

Data Modeling Tools: ERwin (9.x), TOAD, Microsoft Visio.

Data Warehousing: Informatica (8.x, 9.x)

Microsoft Tools: Excel, Word, PowerPoint, Access, and Outlook.

Scheduling Tools: AutoSys, Control-M, cron jobs.

Oracle Utilities: SQL*Loader, external tables, SPOOL, UTL_FILE.

Source Control: Rational ClearCase, Subversion, CVS, Git; CI/CD pipelines

Methodologies: Waterfall, Agile

Reporting tools: SAP BusinessObjects, Tableau.

Markup Languages: XML, HTML, XHTML, CSS

Other Tools: PuTTY, WinSCP, JIRA, Notepad++, EditPlus, Talisma



Business Analyst


  • Wrote PL/SQL scripts to retrieve and process data from user-uploaded Excel sheets.
  • Prepared SQL*Loader scripts to load data from the upstream PECS system.
  • Wrote ETL processes using PL/SQL procedures and functions.
  • Performed data modeling to maintain relationships among product types, ADAC account numbers, and amount types.
  • Wrote code to process large data volumes using bulk collections and bulk inserts for best utilization of CPU, memory, and I/O.
  • Created/modified a series of Control-M jobs.
  • Wrote unit test cases for the CI/CD process.
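
The SQL*Loader step above could look like this minimal control-file sketch; the file, table, and column names are hypothetical placeholders, not the actual PECS feed layout:

```sql
-- load_pecs.ctl: hypothetical control file for a pipe-delimited upstream feed
LOAD DATA
INFILE  'pecs_feed.dat'
BADFILE 'pecs_feed.bad'
APPEND
INTO TABLE stg_pecs_txn
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( account_no,
  amount_type,
  txn_amount  "TO_NUMBER(:txn_amount)",   -- SQL expression applied at load time
  txn_date    DATE "YYYY-MM-DD"
)
```

Run with `sqlldr userid=... control=load_pecs.ctl`; rejected rows land in the bad file for later reconciliation.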


Business Analyst


  • Wrote UNIX and PL/SQL scripts to generate flat files from the US regulatory data mart for Axiom IHC/CCAR reports.
  • Prepared SQL*Loader scripts to load data from upstream sources into the US regulatory data mart.
  • Processed large objects (LOBs) from the web into the Oracle database.
  • Wrote ETL processes using PL/SQL procedures and functions to load data from the upstream STAR data warehouse and other sources into the US regulatory data mart.
  • Performed logical/physical design and modeling of new reporting-related table structures for PACE projection feeds.
  • Performed code optimization and performance tuning for long-running queries.
  • Created/modified Control-M jobs that internally trigger UNIX scripts to load data, execute the regulatory reporting methodology, transfer files, etc.
  • Provided production support for existing reporting activity.
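
A flat-file extract like the Axiom feed above is commonly produced with a SQL*Plus spool script along these lines; the file name, schema, and view are hypothetical illustrations only:

```sql
-- extract_axiom.sql: hypothetical SQL*Plus script spooling a pipe-delimited extract
SET PAGESIZE 0 LINESIZE 32767 FEEDBACK OFF HEADING OFF TRIMSPOOL ON
SPOOL axiom_ccar_feed.dat
SELECT report_id || '|' || cob_date || '|' || balance
FROM   us_reg_mart.ccar_feed_v;   -- hypothetical reporting view
SPOOL OFF
EXIT
```

A scheduler job (e.g. Control-M) can then invoke `sqlplus -s user/pwd @extract_axiom.sql` from a shell wrapper and transfer the resulting file downstream.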
