- Languages: C/C++, SQL, SQL*Plus, PL/SQL, WebSphere Messaging and Queuing (MQSeries 6.0/5.3), WebSphere MQ Integrator (WMQI) v2.1, WBI-MB v6/v5, JMS, Visual Basic 6.0
- RDBMS: Oracle 11g/10g/9i/8.x/7, Oracle EBS R12, DB2, SQL Server 2008, MS Access
- Modeling Tools: ERwin 4.0, SQL Power Suite/SQL Power Architect, Rational Rose, DBDesigner
- Scripting: UNIX shell scripts, Perl, LotusScript
- Operating Systems: Windows, UNIX (Sun Solaris 9), MS-DOS, Mainframe (CA-7)
- Reporting Tools: iDashboards 6.5, SQL Server Reporting Services, Jasper Reports, iReport, Crystal Reports, Cognos, Oracle Reports 6i/2.5, Business Objects XI
- Version Control Tools: PVCS, SCM, Visual Source Safe, Serena Dimensions
- ETL Tools: Pervasive Cosmos 8.12; Informatica PowerCenter 7.1.2/7.1.1/6.2 (Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations, Workflow Monitor, Workflow Manager)
- Oracle Utilities: SQL*Loader, External Tables, TKPROF, Oracle Import/Export/Data Pump
- Other: Microsoft Project, TOAD, SQL Developer, AutoSys, Remedy User 7.5/6.03, J2EE (JSP, Servlets), MySQL, Perl, Web Services, Oracle Enterprise Manager, Python, MongoDB
Oracle Data Warehouse Lead
Technologies: Oracle 11g/10g/9i/8i, Oracle Applications R12, HP-UX, TOAD, ERwin/SQL Power Architect, MS SQL Server 2008, iDashboards, Oracle Reports 6i

The project comprises designing and developing the Reynolds Global Data Warehouse using Oracle 11g/10g/9i/8i and Oracle E-Business Suite R12, and enhancing existing interfaces with ALCOA programs to extract, transform and load data from the EBS system into the warehouse. It includes development of ETL programs to combine SMART Replacement (North American data warehouse), FAST Replacement (European data warehouse), PBIS requirements, and data from legacy systems in the Europe, South America and Asia-Pacific regions, as well as reports developed with SQL Server Reporting Services and iDashboards from OLAP cubes (Sales, Invoice Lines, Financial Report Repository, GL Balances, GL History, GL Interco) built in SQL Server Analysis Services.
Responsibilities included:
- Provided technical leadership on the Global Data Warehouse project by facilitating requirements capture, design, data mapping and development using Oracle 11g/10g/9i/8i, PL/SQL, SQL Server (SSRS, SSIS, SSAS) and iDashboards for reporting.
- Led the effort to develop cross-mapping of data flow between various external systems and Oracle tables; generated data conversion scripts using SQL*Loader and PL/SQL procedures, functions and packages built on Oracle collections, bulk SQL features, ref cursors and analytic functions; developed testing strategies and Korn shell scripts.
- Presented technical research findings to team members and stakeholders and worked closely with them on the principles and ideas behind the proposed solutions.
- Provided technical and functional expertise to various development and functional teams, including a data strategy to support business needs.
- Designed, developed and tuned Oracle applications and provided 24/7 on-call support.
- Mentored junior team members, led development efforts, and worked closely with the team to develop test plans and scripts, track progress and conduct team status meetings.
- Scheduled ETL jobs using the Oracle job scheduler and the Oracle Applications Concurrent Manager, creating request sets for the OLAP/OLTP processes.
- Built and automated Korn shell scripts to FTP delimited flat files from external sources and load them into the Reynolds Global Data Warehouse staging and dimension tables using SQL*Loader.
- Used analytic functions, hierarchical queries and Oracle regular expressions to transform data and calculate aggregations and averages of Total Amount, Net Savings, Amount Due and revenue by operating unit, business unit, customer group (global and high-level customers and customer groups) and market code.
- Set up materialized views/snapshots for data replication across databases; created B-tree indexes for faster data retrieval.
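To illustrate the analytic-function pattern described above, here is a minimal sketch using SQLite via Python's sqlite3 module rather than Oracle; the table and column names are hypothetical, but the windowed SUM/AVG mirrors Oracle's SUM(...) OVER (PARTITION BY ...):

```python
import sqlite3

# Hypothetical invoice data; names are illustrative, not from the project.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (operating_unit TEXT, total_amount REAL)")
conn.executemany(
    "INSERT INTO invoices VALUES (?, ?)",
    [("NA", 100.0), ("NA", 300.0), ("EU", 50.0)],
)

# Window (analytic) functions compute per-group aggregates
# without collapsing the detail rows.
rows = conn.execute("""
    SELECT operating_unit,
           total_amount,
           SUM(total_amount) OVER (PARTITION BY operating_unit) AS unit_total,
           AVG(total_amount) OVER (PARTITION BY operating_unit) AS unit_avg
    FROM invoices
    ORDER BY operating_unit, total_amount
""").fetchall()

for r in rows:
    print(r)
```

Each row keeps its own total_amount alongside the group-level total and average, which is what makes analytic functions convenient for ratio and share-of-total reporting.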
- Tuned long-running SQL queries using indexes, CBO hints, EXPLAIN PLAN, TKPROF and Autotrace; used the DBMS_STATS package to gather table and index statistics.
- Generated DDL scripts to create new database objects (tables, views, indexes, synonyms, sequences, roles and grants) for code migrations.
- Used the open-source SQL Power Architect data modeling tool to reverse-engineer the database and create ERD diagrams for the system.
- Developed stored procedures using ref cursors to return large result sets to the SQL Server reports.
- Responsible for the logical and physical design of the data warehouse; created partitioned tables and partitioned indexes for manageability and scalability of the application.
- Established data processes and ETL routines for a new automated supply chain management system.
- Administered user access to iDashboards and used the Multidimensional Expressions (MDX) language to query and manipulate multidimensional data from the OLAP cubes, delivering performance management, scorecards and business alerts in iDashboards reports.
- Used pivoting functions in MS Access to build a prototype for the business users before building the drill-down reports in iDashboards and SQL Server Reporting Services.
- Developed and enhanced Oracle Reports 6i for a few projects.
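The index-driven tuning workflow above (check the plan, add a B-tree index, confirm the plan changed) can be sketched outside Oracle; this is a rough Python/SQLite analogue in which EXPLAIN QUERY PLAN stands in for Oracle's EXPLAIN PLAN, and the table and column names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE gl_balances (account_id INTEGER, balance REAL)")
conn.executemany("INSERT INTO gl_balances VALUES (?, ?)",
                 [(i, i * 10.0) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN is SQLite's rough analogue of Oracle's EXPLAIN PLAN;
    # the fourth column of each row holds the human-readable plan step.
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT balance FROM gl_balances WHERE account_id = 42"

before = plan(query)   # without an index: a full table scan
conn.execute("CREATE INDEX ix_gl_account ON gl_balances (account_id)")
after = plan(query)    # with the index: an index search

print(before)
print(after)
```

The same before/after comparison is how a tuning pass verifies that the optimizer actually picked up the new index rather than continuing to scan.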
Senior Oracle PL/SQL / Data Warehouse Developer
Technologies: Oracle 9i/10g, Teradata, Unix, SQL*Loader, TOAD, SQL Developer, Job Control, SQL Power Architect, DBDesigner, ERwin, EAI Transfer/WinSCP/FileZilla for the file transfer process, Business Objects XI/Crystal Reports, GoldenGate

The XOHM Data Warehouse (XDW) is a strategic, enterprise-wide resource that enables the business to generate reports, analyze data, and perform predictive modeling in support of the XOHM service. XDW supports XOHM by creating a single repository for all 4G/WiMAX-related data, sourcing its data directly from both Amdocs and Sprint 4G application platforms such as Enabler, Clarify, OMS, OM/IM and OMA/DM. Data marts and Business Objects universes enable canned and ad hoc reporting across the various XOHM business functions, feeding information into executive dashboards, business KPIs, standardized operating metrics and predictive analytics that support the Sales and Marketing teams. XDW also provides data feeds to other systems inside Sprint Nextel and to outside vendors who provide content or services to XOHM subscribers, and caters to automated data loads from multiple sources both internal and external to Sprint Nextel/XOHM. XDW utilizes Transactional Data Management technology from GoldenGate to provide continuous, real-time capture and delivery of data from source to target between various 4G systems.
- Developed a Business Intelligence (BI) and data warehouse reporting system for Sprint to launch XOHM, their new fourth-generation (4G) wireless communication system, a $5 billion project.
- Conducted interviews and application development meetings with technical staff and business users, and mapped business needs to technical requirements from several business units.
- Led the effort to develop cross-mapping of data flow between various external systems and Oracle tables, generated data conversion scripts using SQL*Loader and PL/SQL scripts in the Unix environment, and developed testing strategies for Sprint's XDW database.
- Led technical evaluations of various methods of implementing BI solutions using packages and tools such as Oracle and Business Objects; open-source data modeling tools such as SQL Power Architect and DBDesigner, used to reverse-engineer the database and create ERD diagrams for the source systems; and open-source BI tools such as Pentaho. Presented the results of these evaluations to stakeholder management in both oral and written form.
- Defined, developed and documented BO XI/Crystal reports and dashboards together with the business analysts, incorporating key performance indicators and interactive user features, for implementation in the Business Objects automated reporting system.
Technologies: Oracle 9i/10g, Data Junction 7.5.5, Pervasive Cosmos 8.12/8.14, Informatica 7.1.2, SQL*Loader, XML, TOAD, SQL Developer, SCM, Mercury Quality Center

Provided technical expertise by supporting the architecture, design, development and implementation of Electronic Data Interchange (EDI) projects. MultiPlan receives claims for re-pricing from various clients, such as Cigna, Aetna, United Health Care, Humana and FirstHealth, in the file format appropriate to each client's implementation. These files are decrypted as appropriate and loaded into MultiPlan's proprietary EDP claim transaction system tables.
- Designed and developed healthcare data applications on the Electronic Data Interchange (EDI) team using Oracle PL/SQL and the Data Junction/Pervasive Cosmos/Informatica 7.1.2 ETL tools to extract, transform and load claims data into the database based on the business specifications.
- Installed and configured the Informatica client and Informatica Server.
- Worked with tables holding 50+ million records each, using complex data transformations of approximately 15 transformations per mapping in the mapplets. Used transformations such as lookup, update strategy, router, filter, sequence generator, source qualifier and joiner to extract data according to the business rules and technical specifications, and made extensive use of unconnected lookup transformations to fetch decode values for code-list fields.
- Improved mapping performance by moving filter transformations earlier in the transformation pipeline, performing the filtering at the source qualifier level for relational databases, and selecting the table with fewer rows as the master table in joiner transformations.
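The filter-pushdown principle behind the bullet above is general; this small Python sketch with synthetic data and hypothetical field names shows why filtering before a join examines far fewer row pairs while producing the same result:

```python
# Synthetic claims and clients; all names and values are illustrative only.
claims = [{"client_id": i % 10, "amount": i} for i in range(1000)]
clients = [{"client_id": i, "region": "US" if i < 5 else "EU"} for i in range(10)]

def join_then_filter():
    # Naive order: pair every claim with every client, filter afterwards.
    pairs, out = 0, []
    for c in claims:
        for cl in clients:
            pairs += 1
            if c["client_id"] == cl["client_id"] and cl["region"] == "US":
                out.append((c["client_id"], c["amount"]))
    return out, pairs

def filter_then_join():
    # Pushed-down order: reduce the client side first, then probe it.
    us_clients = {cl["client_id"] for cl in clients if cl["region"] == "US"}
    pairs, out = 0, []
    for c in claims:
        pairs += 1
        if c["client_id"] in us_clients:
            out.append((c["client_id"], c["amount"]))
    return out, pairs

slow, slow_pairs = join_then_filter()
fast, fast_pairs = filter_then_join()
```

Both versions return identical rows, but the pushed-down version touches an order of magnitude fewer pairs, which is the same effect filtering at the source qualifier has on an Informatica pipeline.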
- Developed packages, procedures and functions using Oracle collections and bulk SQL features to implement changes to EDI processes and improve performance. Used analytic functions to calculate aggregations and averages of Total Amount, Net Savings, Amount Due, revenue by client, total number of re-priced claims per client, etc.
- Developed shell scripts to automate daily activities such as loading files into the database using SQL*Loader, FTPing files to and from remote servers, encrypting and decrypting files with PGP, compressing and decompressing files, executing SQL scripts and PL/SQL procedures, running jobs in parallel, job scheduling, and spooling CSV files using Automate.
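A fragment of the SQL*Loader automation pattern above can be sketched in Python; the connect string, control file and data file names below are placeholder assumptions, and the command is only constructed rather than executed, since running it would require an Oracle client installation:

```python
import shlex

def sqlldr_command(userid: str, control_file: str, data_file: str, log_file: str) -> str:
    """Build an sqlldr invocation for loading a delimited flat file.

    All argument values passed in are hypothetical placeholders; a real
    script would hand the result to the shell (and check the exit code).
    """
    args = [
        "sqlldr",
        f"userid={userid}",
        f"control={control_file}",
        f"data={data_file}",
        f"log={log_file}",
    ]
    # shlex.quote keeps the command safe if paths ever contain spaces.
    return " ".join(shlex.quote(a) for a in args)

cmd = sqlldr_command("scott/tiger@orcl", "claims.ctl", "claims.dat", "claims.log")
print(cmd)
```

Centralizing command construction like this keeps the nightly scripts uniform: the same helper serves every feed, with only the control and data file names varying per client.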
- Evaluated Pervasive, Pentaho and Informatica tools and provided a comparative analysis to the client. Also evaluated iReport and JasperReports, which are part of the Jaspersoft BI suite, for designing and generating reports.
Senior Programmer Analyst