BI Lead Developer / Business Systems Analyst Resume
Plano, TX
SUMMARY
- Over nine years of IT experience with expertise in requirement gathering, design, development, deployment, testing, support, and implementation of Data Warehouse applications and database business systems.
- Created BRD (Business Requirement Document), ETL Design Document, Integration Design Document, Test Strategy, High Level Test Plan and Test Design.
- Conducted UAT for Business Users.
- Experience in writing test plans, test cases, unit testing, system testing, integration testing and functional testing.
- Currently working as an ETL Business Analyst cum BI Lead Developer.
- Good knowledge of business verticals including Finance, Health Care, Retail, and the Public Sector.
- Deep knowledge of data mapping from the functional units of FACETS to the EDW model.
- Good knowledge of the Financial Residual Value Model.
- Performed defect root cause analysis.
- Performed a lead role, guiding test teams across three different streams.
- Expertise in documenting the ETL process, Source to Target mapping specifications, high/low level design document, status reports and meeting minutes.
- Experience in working with various databases which include Oracle, Gemfire XD, Greenplum, PostgreSQL, MS SQL Server and Flat files.
- Experience in all phases of development, including Extraction, Transformation and Loading (ETL) of data from various sources into Data Warehouses and Data Marts, using Agile and Waterfall methodologies.
- Experience with relational and dimensional models using Facts and Dimensions tables.
- Expert in using Informatica Power Center 9.1/8.6.0/8.1.0/7.1 and Informatica Power Exchange for extraction, transformation and loading mechanism.
- Extensively used Informatica Workflow manager and Workflow monitor for creating and monitoring workflows, Worklets and sessions.
- Adhered to project timelines to meet all deliverables.
- Experience in conducting diagnosis/troubleshoot of standard business problems, in line with the current technical and applications architecture.
- Proficient in server-side programming: stored procedures, stored functions, database triggers, and packages using PL/SQL and SQL.
- Experience in designing the Aggregates required for the BI Reporting tools.
- Very good understanding of ‘Versioning’ concepts and worked with SVN and Informatica versioning. Worked extensively with versioned objects and deployment groups.
- Experienced with database archiving processes managed centrally for all data, whether archived on premises or in the cloud.
- Good experience in data analysis, error handling, error remediation and impact analysis.
- Experience in Agile and Waterfall methodologies.
- Versatile team player with excellent analytical, communication and presentation skills.
TECHNICAL SKILLS
ETL/DWH Tools: Informatica Power Center 9.5.1/9.1/8.x/7.1, Teradata 12.0 (BTEQ, FastLoad, TPump), SSIS packages.
Databases: Oracle 10g/9i/8i, MS Excel, MS Access, MS SQL Server 2012/2008/2005, DB2, Teradata 14.0/12.0, Greenplum 1.16.1, Gemfire XD.
DBMS/Query Tools: TOAD, Rapid SQL, SQL Developer, WinSQL, SQL Assistant, SQL Navigator, PL/SQL Developer, pgAdmin3, SQuirreL SQL Client 3.6.
Operating Systems: Microsoft Windows (Vista, XP, 2000, NT 4.0), OS/2; UNIX (Sun Solaris, HP-UX).
Programming Lang: SQL, PL/SQL, PostgreSQL, UNIX Shell Scripting, T-SQL, VB Script, Java, C, C++, C#.
Data Analysis: Data Design/Analysis, Business Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
PROFESSIONAL EXPERIENCE
Confidential, Plano, TX
BI Lead Developer/ Business Systems Analyst
Responsibilities:
- Working as a Business Systems Analyst cum BI Lead, handling multiple projects in TFS.
- Conducting day-to-day meetings with Financial and Accounting business users, understanding their requirements, and creating the Business Requirement Document.
- Created the Architectural diagram for the AAD (Additional Accumulated Depreciation) Model using Visio.
- Created the Architectural diagram of AS IS and TO BE for the Lease Cube Migration project using Visio.
- Provided the BRD walkthrough to all stakeholders, including the Legal, Compliance, and Information Security teams within TFS.
- Clarified all questions and queries from the Legal, Compliance, and Information Security teams related to the new initiative.
- Followed up closely with Legal, Compliance, and Information Security for the BRD approvals.
- Received appreciation from the Legal and Compliance teams for a well-documented BRD.
- Created a data model to feed the predictive analytics (AAD Model).
- Gathered Requirements, designed, developed and implemented the AAD (Additional Accumulated Depreciation) Model Data Acquisition successfully.
- Gathered Requirements for other parallel projects viz. the AAD Model Integration, RV Settings and Migration of Historical Forecast Data from Legacy system (ESSBASE Lease Cube) to Netezza.
- Worked closely with TFS Committee members, conducting meetings to walk through the BRD for the approval process.
- Closely working with TFS Architect team and adhering to all TFS BI policies.
- Completed all documentation required for approval and maintained the documents in the shared location.
- Designing the ETL Framework for the development of the AAD Impairment Model - Data Acquisition as a part of RV Forecast.
- Designed complex logic in stored procedures following TFS standards; the approach was well received by the team.
- Closely worked with TFS Architect to establish the Data quality methodology, conduct gap analysis, Design review and Approval.
- Closely worked with the TFS Core Committee Team to establish the TFS process within the AAD Model and obtained approval for the Gate Reviews.
- Assigned tasks to the offshore team and provided knowledge transfer for code development.
- Resolved issues within the agreed timelines.
- Worked on ad hoc requests from clients.
- Analyzed data issues and provided solutions.
- Conducted code walkthroughs with Architects and obtained approvals.
- Conducted a PROD FIX walkthrough of the previous phase and got approval.
- Followed the client's internal process by creating Priority tickets, RFCs (Requests for Change), and ERFCs (Emergency Requests for Change) for the corresponding tasks.
- Followed up with the Business, Development, Process, and Architect teams on any pending tasks.
- Modified ETL Mapping for a Data Acquisition file.
- Wrote SQL queries against Netezza and sent the query results to the SAS model.
- Created MFT folder requests for the input files as part of Data Acquisition.
- Attended daily stand-up meetings and provided status updates.
- Played an additional role in decommissioning the current forecast system, the ESSBASE Cube.
- Followed up with users to get their responses and updated the user matrix per department.
- Gathered requirements so the new system could provide ESSBASE Cube users with the same information.
- Implemented SOX (Sarbanes-Oxley) FSM (Financial Systems Maintenance) controls on the AAD Model.
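A bullet above mentions writing SQL queries against Netezza and sending the results to the SAS model. A minimal sketch of that hand-off pattern, using sqlite3 as a stand-in for the warehouse connection (table and column names are hypothetical; the real environment would use a Netezza driver and an MFT drop location):

```python
import csv
import io
import sqlite3

def export_query_to_csv(conn, sql, out):
    """Run a query and stream the result set, header row first, to a CSV
    hand-off that a downstream model (e.g. SAS) can ingest."""
    cur = conn.execute(sql)
    writer = csv.writer(out)
    writer.writerow([col[0] for col in cur.description])  # column names
    writer.writerows(cur)                                 # data rows

# Demo against an in-memory stand-in table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE depreciation (asset_id TEXT, aad REAL)")
conn.executemany("INSERT INTO depreciation VALUES (?, ?)",
                 [("A1", 1200.0), ("A2", 800.5)])
buf = io.StringIO()
export_query_to_csv(conn,
                    "SELECT asset_id, aad FROM depreciation ORDER BY asset_id",
                    buf)
print(buf.getvalue())
```

In practice the `out` stream would be a file in the MFT folder rather than an in-memory buffer.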
Environment: Visio, Visual Studio 2010, Netezza, Informatica 9.5.1, Flat Files, CSV files, SAS
Confidential, Jessup, PA
ETL BA/ETL Developer/Build Coordinator
Responsibilities:
- Worked as a Business Analyst cum Developer in the ETL Framework development of MOMs EDW which involves MS SQL Server, Gemfire XD and Greenplum as architecture stack.
- Created the Mapping document for the Greenplum Functions to SSIS Metadata Framework.
- Created the Product Backlog with the help of the Product Owner.
- Created user stories and their acceptance criteria for the project.
- Certified completed stories before sending them to the Product Owner for review.
- Conducted collaboration sessions with the development team to clarify their doubts.
- Helped in demos during the sprint review.
- Received appreciation throughout the development phase.
- Gained exposure to the custom C# package that generates SSIS packages based on the ETL metadata framework.
- Worked with Architect to establish the data quality methodology, conduct gap analysis and design review.
- Configured the metadata framework to migrate the existing process as-is to the new environment.
- Re-Engineered non-performing existing Greenplum functions into a Metadata Framework suitable for SSIS Package migration.
- Wrote a stored procedure to automate Medicaid/Medicare ID conversion in the metadata framework.
- Integrated FACETS models such as member, network provider, claim, member provider, and billing.
- Created the conversion tables in MS SQL for maintaining the internal data integrity and publish master key in the system.
- Ensured the successful development and unit testing of the SSIS package to read the data from FACETS, transform and load into the staging table in SQL Server.
- Configured Metadata Framework to load into staging tables and then to Gemfire Tables.
- Wrote DDLs and queries, created indexes and primary key constraints, and wrote unit-testing scripts in Gemfire XD using SQuirreL.
- Worked on Greenplum functions for creating the aggregates for the dashboard reports.
- Wrote Greenplum functions to load data from flat files to dimension and fact tables.
- Involved in unit testing, code reviews, and promotion to SIT.
- Central point of contact for cross-stream touch points involved in the MOM's project implementation.
- Participated in project scrum/status meetings and provided daily status.
- Established best practices standards and ensured adherence to them.
- Documented the design, data flow diagram, unit case document, metadata frame work design document etc.
- Documented the MCRR (Cross stream project) process for the data load.
- Generated 837 I and P XML by customizing the SSIS package and metadata tables, in line with healthcare rules and regulations (HIPAA).
- Created mapping documents outlining data flow from sources to targets.
- Involved in deployment activities: creating and updating the metadata DDL/stage DDL, metadata DML, and configuration DML, and committing the changes to SVN.
- Followed the release process: prepared release notes and deployment forms and coordinated approval for the deployment from the client's manager.
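The metadata-driven loading described above can be sketched as a small config-to-SQL generator: metadata rows describe each source-to-staging mapping, and the framework generates the load statements. This is an illustrative sketch only, not the actual SSIS/Greenplum framework, and all table and column names are hypothetical:

```python
import sqlite3

# Hypothetical metadata: which source table/columns feed which staging table.
METADATA = [
    {"src": "facets_member", "tgt": "stg_member",
     "cols": ["member_id", "plan_code"]},
]

def build_load_sql(entry):
    """Generate the INSERT ... SELECT for one metadata entry, mirroring
    how a metadata framework derives per-table loads from config rows."""
    cols = ", ".join(entry["cols"])
    return (f"INSERT INTO {entry['tgt']} ({cols}) "
            f"SELECT {cols} FROM {entry['src']}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facets_member (member_id TEXT, plan_code TEXT)")
conn.execute("CREATE TABLE stg_member (member_id TEXT, plan_code TEXT)")
conn.execute("INSERT INTO facets_member VALUES ('M100', 'GOLD')")
for entry in METADATA:
    conn.execute(build_load_sql(entry))
print(conn.execute("SELECT * FROM stg_member").fetchall())  # [('M100', 'GOLD')]
```

The appeal of this design is that onboarding a new feed is a metadata insert, not a new hand-written package.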
Environment: Visual Studio 2010, MS SQL Server 2012, Visio, Informatica 9.1, pgAdmin3 for Greenplum 1.16.1, Gemfire XD, SQuirreL SQL Client 3.6, ADO.NET, Flat Files, CSV files, PostgreSQL (functions).
Confidential
Informatica Developer/ Project Manager
Responsibilities:
- Worked with various Databases for extracting the files and loading them into different databases.
- Designing the ETLs and conducting review meetings.
- Played a lead role in managing the offshore team.
- Worked mainly on troubleshooting errors that occurred during the loading process.
- Created stored procedures, triggers, tables, indexes, rules, etc. as needed to support extraction, transformation and load (ETL) processes.
- Designed and developed integrations to export the customer data from the EDW to Greenplum DW.
- Designed and developed integrations to import back the customer scores from flat files generated by Greenplum.
- Created a replication process to keep re-engineered database in sync with legacy database.
- Extensively worked on creating mapping parameters and variables which are used in various workflows for reusability.
- Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation.
- Extensively worked with various Passive transformations in Informatica Power Center like Expression Transformation, and Sequence Generator.
- Worked on performance tuning and optimization of the Sessions, Mappings, Sources and Targets.
- Extensively worked with Slowly Changing Dimensions Type1, Type2, for Data Loads.
- Wrote stored procedures in SQL and extensively used the Stored Procedure transformation in many scenarios as per the requirements.
- Worked with re-usable objects like Mapplets.
- Extensively worked with aggregate functions like Avg, Min, Max, First, Last in the Aggregator Transformation.
- Extensively used SQL Override function in Source Qualifier Transformation.
- Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters.
- Wrote queries and procedures, created indexes and primary keys, and performed database testing.
- Analyzed, fixed, tested, tracked, and reviewed defects.
- Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
- Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
- Involved in performance tuning and monitoring (both SQL and Informatica) considering the mapping and session performance issues.
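The Slowly Changing Dimension Type 2 loads mentioned above follow a standard expire-and-insert pattern: when a tracked attribute changes, the current row is closed out and a new current version is inserted. A minimal sketch using sqlite3, with a hypothetical customer dimension (the actual work used Informatica mappings, not hand-written SQL):

```python
import sqlite3

def scd2_upsert(conn, cust_id, name, load_date):
    """Type 2 logic: if the tracked attribute changed, expire the current
    row (is_current=0, set end_date) and insert a new current version."""
    row = conn.execute(
        "SELECT name FROM dim_customer WHERE cust_id=? AND is_current=1",
        (cust_id,)).fetchone()
    if row is not None and row[0] == name:
        return  # no change: leave the dimension untouched
    if row is not None:
        conn.execute(
            "UPDATE dim_customer SET is_current=0, end_date=? "
            "WHERE cust_id=? AND is_current=1",
            (load_date, cust_id))
    conn.execute(
        "INSERT INTO dim_customer (cust_id, name, start_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (cust_id, name, load_date))

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (cust_id TEXT, name TEXT, "
             "start_date TEXT, end_date TEXT, is_current INTEGER)")
scd2_upsert(conn, "C1", "Acme Corp", "2024-01-01")
scd2_upsert(conn, "C1", "Acme Corporation", "2024-06-01")  # change -> new version
print(conn.execute("SELECT name, is_current FROM dim_customer "
                   "ORDER BY start_date").fetchall())
```

Type 1 differs only in the branch taken on change: update in place instead of expire-and-insert.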
Environment: Informatica Power Center 9.1, Oracle 10g, Netezza 6.0.8, Web services, MS Access 2010, SQL*Loader, UNIX, WinSCP, PuTTY, SQL Developer, PL/SQL.
Confidential
ETL Informatica Developer
Responsibilities:
- Worked with Business Analysts to understand business/system requirements and transform them into effective technology solutions by creating Technical Specifications (Source-to-Target documents) from the Functional Specifications.
- Worked mostly on Lookup, Aggregator, and Expression Transformations to implement complex logics while coding a Mapping.
- Worked with Memory cache for static and dynamic cache for the better throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations.
- Monitoring the Jobs, observing the performance of the individual transformations, optimizing and tuning the mappings (Performance Tuning) to achieve higher response times.
- Migrating objects between different Environments using XML Export/Import (using Repository Manager).
- Developed code to load the data from Flat File to stage and stage to ODS.
- Developed mappings with XML as target and formatting the target data according to the requirement.
- Developed design specifications following the data warehousing concepts.
- Created mappings to incorporate Incremental loads.
- Developed reusable Mapplets to include the Audit rules.
- Working with Power Center Versioning (Check-in, Check-out), Querying to retrieve specific objects, maintaining the history of objects.
- Designed the Workflow for the ODS load following the load Dependencies.
- Involved in fixing Invalid Mappings, testing of Stored Procedures and Functions, Unit and Integration Testing of Informatica Sessions, Batches and Target Data.
- Solely responsible for the daily loads and handling the reject data.
- Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements.
- Involved in the migration of existing ETL process to Informatica Power center.
- Involved in migrating objects, mappings, workflows from Dev environment to QA environment.
- Replicated operational tables into staging tables, to transform and load data into one single database using Informatica.
- Created stored procedures and sequences to insert keys into database tables.
- Involved in writing Shell scripts for file transfers, file renaming and few other database scripts to be executed from UNIX.
- Extensively Used Shell Scripting to schedule the Informatica Jobs.
- Maintained/Fixed bugs in existing Shell scripts.
- Guided the testing team and the offshore development team and monitored the Implementation.
- Provided support for the Nightly jobs.
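Scheduling Informatica jobs from shell scripts, as mentioned above, typically wraps the pmcmd command-line tool. A sketch of how such an invocation could be assembled; service, domain, folder, and workflow names are placeholders, and flag usage should be checked against the installed pmcmd version:

```python
def build_pmcmd_start(service, domain, user, folder, workflow):
    """Assemble the pmcmd startworkflow invocation a scheduling shell
    script would run. The password is taken from an environment variable
    via -pv so it is never hard-coded in the script."""
    return ["pmcmd", "startworkflow",
            "-sv", service,          # Integration Service name
            "-d", domain,            # domain name
            "-u", user,              # repository user
            "-pv", "PM_PASSWORD",    # env var holding the password
            "-f", folder,            # repository folder
            "-wait",                 # block until the workflow completes
            workflow]

cmd = build_pmcmd_start("IntSvc", "Domain_ETL", "etl_user",
                        "NIGHTLY", "wf_daily_load")
print(" ".join(cmd))
```

A cron entry or enterprise scheduler would then execute this command and key the next step off its exit code.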
Environment: Informatica Power Center 8.6.1, Oracle 10g, PL/SQL Developer, XML, Windows XP, Rational Clear quest, Shell, TOAD.
Confidential
Project Manager
Responsibilities:
- Collaborated with team members in development and adoption of agile methodologies.
- Maintained effective communication to keep all stakeholders involved with the project stages.
- Performed quality assurance on the work designed by the project team.
- Collaborated with the client to drive a current state assessment, gap analysis and remediation phase for the project.
- Successfully implemented various project phases including planning, IT build, UAT, and system rollout.
- Coordinated and scheduled joint discovery sessions with subject matter experts across the enterprise.
- Facilitated steering committees and working sessions, plus provided regular updates to senior executives.
- Prioritized project phases in new technology development.
- Ensured adherence to required standards and processes; established and suggested best practices for quality control, quality testing/auditing, project documentation, and process improvements.
- Managed vendor relationships and global collaboration across multiple continental teams and business units.
- Evaluated communications plans that set realistic expectations and became a platform for quick issue resolution and decision making.
- Customized project modeling for each client based on their requirements.
- Organized Cross-functional Project teams managing team members on shore / off shore.
- Collaborated with business, technical architect groups, development teams, unit testing and QA teams to ensure that the business requirements are implemented correctly.
- Established best practices standards and ensured adherence to them.
Environment: Informatica Power Center 8.1.0, Toad, SQL Server 2005, IBM DB2, Teradata 12.0, Erwin, Oracle 10g, SQL, ScrumWorks Portal, Windows XP Professional.
Confidential
Informatica Developer/Tier 2 Support
Responsibilities:
- Joined the project for support and then moved to ETL development.
- Used the BMC Remedy tool to log and resolve customers' problems within the SLA timelines defined by the clients.
- Involved in the Support and Maintenance activities of Siebel Marketing Application.
- Maintained healthy interaction with the stakeholders.
- Worked with SQL queries to provide customers with extracts from the DB with the help of the Informatica tool.
- Provided and simulated user access rights using test users in the Marketing Application.
- Created test users using XML Spy and tested them against functionality and defects raised by customers and the call center.
- Providing the requested information/extract of users from Siebel Analytic Application.
- Debugged issues raised by customers and provided solutions, either by resolving the problem or by escalating it to the relevant third party.
- Creation of LOV, Loyalty Programs, Account etc.
- Found, logged, and tracked defects with respect to functionality.
- Met 100% SLA target individually.
- Involved in interacting with Business Analyst in analyzing the requirements.
- Involved in documenting the Technical and Functional Documents like High Level Design, Low Level Design and System Specification Document.
- Developed ETL mappings, Transformations and Loading using Informatica Power Center.
- Developed various mappings to load data from various sources using different Transformations including Router, Aggregator, Joiner, Lookup, Update Strategy, Stored Procedure, Sorter, Filter, Source Qualifier, Expression, Union and Sequence Generator to store the data in target tables.
- Developed mapping based on the mapping specification document that indicates the source tables, columns, data types, transformation required, business rules, target tables, target columns and data types.
- Involved in Unit Testing, Integration Testing and End-End Testing.
- Captured and documented the Unit Test Results.
- Used Workflow Manager to create workflows and sessions, and used various tasks such as Command and Email.
- Defined Target Load Order Plan for loading data into different Target Tables.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Created effective test data and developed thorough unit test cases to ensure successful execution of the data-loading processes.
- Created reports using business object functionality like queries, slice and dice, drill down, functions and formulas.
Environment: Informatica Power Center 7.1.4 (Repository Manager, Power Center Designer, Workflow Manager, Workflow Monitor), Oracle 10g, Teradata 12.0, UNIX, Windows 7/XP, JIRA, SQL Developer 3.0, Flat Files, SQL, PL/SQL, Informatica Power Exchange, Microsoft tools, PuTTY, WinSCP.
Confidential
Analyst/ Quality Assurance
Responsibilities:
- Used Quality Center for test management: requirement study, test case writing, test case referencing, test execution, test result logging, and defect logging and tracking.
- Prepared regression test scripts in JMeter and executed them in the QA environment.
- Retested fixed defects.
- Hands-on experience in QTP with VBScript.
- Extensive functional knowledge of the Siebel Marketing and Analytics application for verifying results in user segments for campaign deployment.
- Performed UAT of digital email campaigns and SMS marketing for Austria, Switzerland, and India.
- Support and Maintenance activities of Siebel Marketing Application.
- Creation of Email Marketing server/Instant templates/ Account/ Program/ Campaign/ Offer/ Treatment/ Segments and LOV addition.
- Hands-on experience with the Informatica tool for taking flat-file extracts; used FTP servers, PuTTY, etc. to run the workflows and trigger the Informatica mappings for Source Code updates (one of the key fields of digital email marketing).
- Maintained healthy interaction with all stakeholders.
- Creation, testing, loading, launching and support of Life Cycle based Campaign, Web/Email Offer, Segment, importing the templates in Siebel Application and attaching them for the UAT Test Users.
- Used the Verify and Preview features in Siebel to check that the templates contained no errors, and debugged any errors that occurred.
- Corrected HTML templates received from the third-party vendor with the help of the EditPlus tool.
- Worked intensively on Siebel web service testing using XML Spy and JMeter.
- Created test users for system and regression testing using XML Spy and JMeter.
- Worked with SQL queries to check test data at the DB level.
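DB-level test-data checks like the one above often start with source-versus-target row-count reconciliation after a load. A minimal sketch using sqlite3 with hypothetical table names (the actual checks ran as SQL against the project databases):

```python
import sqlite3

def reconcile_counts(conn, pairs):
    """Compare row counts for each (source, target) table pair; return
    the mismatched pairs with their counts -- a basic post-load sanity check."""
    mismatches = []
    for src, tgt in pairs:
        n_src = conn.execute(f"SELECT COUNT(*) FROM {src}").fetchone()[0]
        n_tgt = conn.execute(f"SELECT COUNT(*) FROM {tgt}").fetchone()[0]
        if n_src != n_tgt:
            mismatches.append((src, tgt, n_src, n_tgt))
    return mismatches

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_offer (id INTEGER)")
conn.execute("CREATE TABLE tgt_offer (id INTEGER)")
conn.executemany("INSERT INTO src_offer VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO tgt_offer VALUES (?)", [(1,), (2,)])
print(reconcile_counts(conn, [("src_offer", "tgt_offer")]))  # one mismatch
```

Count checks catch dropped or duplicated rows cheaply; column-level checksums would be the next step for content drift.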
Environment: Informatica Power Center 7.1, Windows 2000, Solaris (SunOS 5.8), Oracle 9, Toad 7.6, PuTTY, FileZilla, SQL*Plus.