Sample Professional Curriculum Vitae / Resume / CV Format with Career Objective, Job Description, Skills & Work Experience for Freshers & Experienced in Word / Doc / PDF Free Download
Download Resume Format
Objective
Business Data Analyst professional seeking a challenging position in data warehousing, data conversions/migrations, data reporting, and data quality initiatives.
Profile
Motivated, driven professional with strong abilities in SDLC and data-related disciplines such as analysis, profiling, quality, modeling, metadata, conversions, requirements gathering, and documentation management. Experienced in identifying data needs and resolving issues promptly. Proven abilities with application projects, database engineering, data analysis, data mapping, and ETL methodologies. Experienced in automated and manual testing practices.
Strong verbal and written communication with customer, business, and technical communities; dedicated to teamwork and capable of working independently or as part of a team. Strives to go above and beyond to deliver business objectives.
Technical
Oracle 9i/10g/11g, SQL, PL/SQL, Cúram, Talend, UNIX, DB2, Cognos BI 8.4 (Report Studio, Query Studio, Analysis Studio), Cognos Connection, Framework Manager, TOAD, Ab Initio, Erwin, SAP MDM, XML, COBOL, PHP, SAP, DBA Liaison, QMF, SPUFI, MVS/JCL, PROC, VSAM, TSO/ISPF, MS Office Suite (Access, Excel, Word, PowerPoint, Outlook), QTP/WinRunner, Quality Center/Test Director, FTP, Lotus Notes, Visio, SQL Server 2005.
Experience
Business Data Analyst/QA
5/2010 – Present    Hewlett Packard, Mt Laurel, NJ
Client: State of NJ, DHS/DFD
Responsibilities:
Gained Cúram functionality knowledge and expertise for a large data conversion project that was part of an ERP solution. Gathered system functionality through client interviews and document gathering. Acted as SME for 3 major state systems for the development teams. Created mappings and ETL for the data conversion process. Created client documentation to support business processes and sign-off.
· Created data extract, translate, and load processes into the Cúram application utilizing Cúram BPOs (Business Process Objects) and lower-level services.
· Created data mappings for the data conversion process using an MS Access tool.
· Created PL/SQL stored procedures for updating and loading data tables through multiple stages of the data conversion.
· Utilized SQL to create ad-hoc reports for clients and reporting.
· Validated converted data through each stage of conversion to ensure completeness.
· Created and validated Case Management for FAMIS participants, including integrated cases with multiple products, evidence, income support, etc.
· Mapped and created Service Plans from the OMEGA source system to NJ Cass.
· Created test data for Cúram test loads, ensuring validity of data and relationships to support NJ Cass functionality.
· Created technical specifications for the development team based on client meetings/discussions.
· Created Oracle target table references to support the MS Access tool used for mapping. Continually reconciled the Oracle database with system modifications from the client.
· Performed system analysis of the FAMIS, OMEGA, and Medicaid systems for Social Services / Health & Human Services systems.
· Used the Talend ETL tool to develop metadata based on COBOL copybooks.
· Created multiple documents for the development, technical, and business staff.
· Developed strong working (interpersonal and communication) relationships with the business community.
· Acquired a solid understanding of Cúram functionality, including the Cúram 5.2 data model.
· Provided subject matter expertise regarding the Cúram software product.
· Created ETL scripts to generate, populate, and compile database objects, along with batch processing, bug tracking, versioning, unit testing, and documentation.
· Created staging tables and loaded them with data from production/reference tables according to the design specifications.
· Created ETL scripts to migrate data from flat files to Oracle staging databases.
· Developed PL/SQL stored procedures for generating error reports for data validations and to update Oracle tables with pertinent information (see the sketch after this list).
· Created conversion reports for data errors and anomalies.
· Met deadlines and managed multiple project tasks simultaneously.
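As an illustration of the PL/SQL error-reporting procedures described above, a minimal sketch follows; the table and column names (stg_participant, conv_error_rpt, ssn, case_ref, load_status) are hypothetical stand-ins, not the actual conversion schema.

-- Minimal sketch only: object names are hypothetical, not the real conversion schema.
CREATE OR REPLACE PROCEDURE report_conversion_errors (p_stage IN VARCHAR2) AS
BEGIN
  -- Log staging rows that fail basic completeness checks for the given conversion stage.
  INSERT INTO conv_error_rpt (stage_name, record_id, error_desc, logged_at)
  SELECT p_stage,
         s.record_id,
         CASE
           WHEN s.ssn IS NULL      THEN 'Missing SSN'
           WHEN s.case_ref IS NULL THEN 'Missing case reference'
           ELSE 'Unknown validation failure'
         END,
         SYSDATE
  FROM   stg_participant s
  WHERE  s.ssn IS NULL OR s.case_ref IS NULL;

  -- Flag the offending rows so later conversion stages can skip them.
  UPDATE stg_participant s
  SET    s.load_status = 'ERROR'
  WHERE  s.ssn IS NULL OR s.case_ref IS NULL;

  COMMIT;
END report_conversion_errors;
/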
Data Quality Analyst (Consultant)
1/2010 – 5/2010
Fannie Mae, Herndon, VA
· Executed data validation procedures and identified/resolved data issues, providing insights by querying and analyzing the PRDW data warehouse.
· Supported the Property Data Warehouse systems through interactions with developers and end users to analyze and verify business requirements.
· Provided software development lifecycle skills to define and develop testing requirements as needed for implementation of new or enhanced application functionality.
· Created detailed training documents for user access to production applications.
· Provided support to the
· Interacted with the development team to gather specifications for new and enhanced application functionality.
· Performed regression and functionality testing for enhanced applications in support of development teams.
· Manually tested the functionalities of the PRDW application and validated them against the requirements.
· Created and executed automated test scripts for regression and functionality testing for the PRDW and its interfacing systems.
· Supported the PRDW production database by facilitating the implementation of defect resolutions and functionality enhancements.
· Created Visio diagrams to denote the data and process flows of the production environment.
· Analyzed requirements for database changes, entered change requests in ClearQuest, and completed SDLC-required documentation.
Data Warehouse Analyst (Consultant)
9/2009 – 12/2009
Medco Health
Responsibilities:
Data Analyst working on a Business Intelligence Platform supporting the marketing campaign data processes and initiatives. Facilitated data mapping sessions with business stakeholders, data architects, and ETL onshore/offshore staff. Developed documentation to support the implementation efforts for the new Campaign Marketing BIP2 Platform.
§ Coordinated, scheduled, and facilitated data mapping sessions that document transformations for source-to-target mappings.
§ Developed technical documents and mapping templates for the implementation of data mapping strategies to support ETL processes. Performed source data validation of business rules and logic from the Teradata database to the target Oracle data warehouse on a BI Platform.
§ Managed documentation and developed pertinent documents for project tracking and status reporting.
§ Created SQL queries to validate transformations and data quality (see the sketch after this list).
§ Participated in data modeling and reconciling the integration of data models from various subject areas.
§ Performed data profiling efforts utilizing the Ab Initio profiling tool for source data patterns, column statistics, inconsistencies, and anomalies, and analyzed individual and multiple columns to determine relationships between columns and tables.
§ Used Cognos Report Studio for the data validation process.
§ Worked with Cognos Framework Manager to establish relationships for database modeling.
§ Initiated data reconciliation practices to align source data with the target data warehouse. Reconciled the logical data model to the physical data model for mappings and ETL processes. Utilized Erwin to update and maintain the data dictionary.
§ Worked closely with stakeholders in refining and finalizing transformation and business rules.
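As an example of the kind of SQL validation query mentioned above, the sketch below compares row counts and amount totals between a staged copy of the source extract and the target warehouse table. The names (stg_campaign_src, dw_campaign_fact, campaign_id, spend_amt) are hypothetical, and it assumes the source extract has already been staged on the Oracle side rather than queried directly in Teradata.

-- Hypothetical reconciliation query: flags campaigns whose row counts or totals
-- differ between the staged source extract and the target warehouse table.
SELECT src.campaign_id, src.src_rows, tgt.tgt_rows, src.src_amt, tgt.tgt_amt
FROM  (SELECT campaign_id, COUNT(*) AS src_rows, SUM(spend_amt) AS src_amt
       FROM   stg_campaign_src
       GROUP  BY campaign_id) src
FULL OUTER JOIN
      (SELECT campaign_id, COUNT(*) AS tgt_rows, SUM(spend_amt) AS tgt_amt
       FROM   dw_campaign_fact
       GROUP  BY campaign_id) tgt
  ON  src.campaign_id = tgt.campaign_id
WHERE src.src_rows <> tgt.tgt_rows
   OR NVL(src.src_amt, 0) <> NVL(tgt.tgt_amt, 0)
   OR src.campaign_id IS NULL
   OR tgt.campaign_id IS NULL;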
Business Data Analyst, QA, Technical Writer (Consultant)
9/2007 – 12/2008
Dept of Navy
Responsibilities: Performed roles as Business Data Analyst and Quality Assurance Analyst, which included initiating customer interviews to gather business requirements, gathering data requirements, producing formal documents to support requirement sessions, following up with cross-functional teams, and managing technical teams for timely deliveries. Participated in data modeling sessions and coordinated data cleansing priorities for multiple data sources, source-to-target mapping, and analysis of data from multiple systems. Supported development team and QA efforts.
§ Managed the full development life cycle for the Aircraft Wiring Systems for NAVAIR.
§ Gathered requirements for a new initiative for the DON to support the wiring maintenance of aircraft. Performed client interviews to document current system functionality.
§ Developed documentation to support functional requirements as well as the As-Is and To-Be business processes.
§ Prepared work/data flow documentation based on analysis of the current legacy system. Created pivot table views to present data to the customer for review and reporting.
§ Created documentation to support the workflow and dataflow diagrams, define the actor roles and responsibilities, and define/document business rules for the to-be system.
§ Developed the "Software Requirements Specification" (SRS) based on client requirement interviews and business analysis.
§ Managed the SQL Server database for developing queries and producing ad hoc reports.
§ Developed the Performance Verification Matrix for the verification and testing for QA efforts.
§ Created ETL rules and scripts for source data quality efforts to the target DB repository. Performed source-to-target mapping processes and developed documentation to support the efforts.
§ Created data, test cases, and scenarios for the quality assurance testing cycles.
§ Created automated test scripts using macros for a repeatable testing process, including positive and negative test cases.
§ Created the Requirements Traceability Matrix for validating positive/negative test cases for the AWIS application to confirm all scenarios passed/failed testing against critical/non-critical standards.
§ Developed the Data Dictionary documentation, which includes data technical and business descriptions, data element characteristics, and the function of each element, to support the Aircraft Wiring Data Warehouse.
§ Developed the application User Guide for the backend data warehouse system and the stand-alone system. The user guide was instituted as the training manual to assist new hires and current personnel on the new data warehouse application.
§ Supported application developers with functional specifications based on the software requirements spec.
§ Participated in data modeling sessions for developing the logical and physical database designs. Ensured business rules were represented in the new design.
§ Created Visio diagrams to support the requirement efforts and for use case instances.
§ Participated in the new GUI designs of the current system to provide a more comprehensive application that coordinates the flow of data with the natural flow of work.
§ Developed data mapping documentation to ensure accurate data representation and uncover potential data quality issues. Reviewed data mapping efforts with the development team for consistency and verification.
§ Created test data for QA efforts to maintain integrity and to validate functional testing. Performed, documented, and reported manual functional, integration, and regression testing to developers, management, and users.
§ Recorded meeting minutes from weekly PMR and CDR meetings and distributed them to management, where they were used to develop the next week's agenda.
Data Analyst (Consultant)
11/06 – 6/07    JP Morgan Chase
Responsibilities: Developed a MySQL database system for the tracking and trending of tapes. Managed data from outside vendors for source-to-target mappings, data quality, and profiling. Created a complex SQL reporting system for an automated process to deliver trend reports detailing the movement of media in and out of the data center.
§ The project consisted of standardizing the Data Center Operations tape processing procedures and policies. Acquired knowledge about Tape Operations, including Veritas - Silos, Vertices, NetBackup, Direct Attach, and home-grown systems. Documented the infrastructure.
§ Performed source-to-target data mappings and provided documentation to support the effort. Maintained the MySQL database system for sourced data from disparate systems.
§ Worked closely with developers in the creation of the ETRS MySQL database system. Created DDL, DML, and SQL queries to support the data model.
§ Developed formal documentation to support management efforts such as Process Flow and Trend Metrics.
§ Analyzed trend movements (in/out), duplicates, the impact of 38 data centers, retention periods, origination of tapes, mismatches and matches, open and closed containers, and root causes of issues. Reported statistics based on inbound and outbound transactions and on-site/off-site inventory. Determined root causes for breaks in tape processing. Created reports for management reviewing these statistics.
§ Gathered and cleansed data extracts from offsite vendors (IM and VRI) in order to populate the MySQL database for the ETRS system to reconcile differences.
§ Formulated SQL queries to extract pertinent information for the reconciliation process. This included (but was not limited to) creating ad hoc tables and queries to get quick answers and formulate solutions for existing issues.
§ Built a complex Access SQL query system to extract data for an automated Excel reporting process.
Data Quality Analyst (Consultant)
10/05 – 9/06    Citigroup Inc., NYC
Responsibilities: Performed a key role maintaining the enterprise-wide mapping system (Excel) for multiple systems. Supported SDLC efforts from data acquisition through the database processes, reports, and the ODS. Presented and managed analysis of data from disparate systems and sources, which was sourced from legacy systems and targeted for a large data warehouse. Developed ETL processes and scripts for business rules, data profiling, data quality, and data reconciliation, and developed a root cause analysis process for rejected records.
§ Developed a methodology for the reject analysis process using cross references from raw data and data movement through each process for reconciliation issues.
§ Gathered specifications and maintained reliability of the Enterprise Mapping System, which includes raw data record layouts, data models, ETL rules, search engines, the DB2 ODS (Operational Data Store), Oracle staging tables, and the Oracle star schema for Business Intelligence. Implemented updates for the metadata system using Excel and VB macros.
§ Gathered detailed information and performed analysis on data elements in order to merge 4 legacy LOBs into the data warehouse.
§ Used Ab Initio to develop graphs for analysis and resolution of discrepancies in raw data from various LOBs.
§ Analyzed functional and non-functional categorized data elements for data profiling and mapping from the source to the target data environment. Developed working documents to support findings and assign specific tasks.
§ Reviewed and validated exported XML formats for verification before they were imported into the ODS and ADM systems.
§ Worked with the Data Steward to define mandatory tasks for data movement throughout the CRM system. Developed prioritization documents for team efforts.
§ Supported the quality assurance effort with data analysis and execution of SQL queries against DB2 tables in a UNIX environment for data validation and verification processes through multiple environments. Worked closely with the QA team to confirm validation.
§ Assisted in the implementation of dimensional modeling for logical and physical designs.
§ Developed test data for several processes required for the data validation effort, both manually and through SQL processes. Assisted the ETL team in privatizing data to satisfy the data security requirements and regulations, as well as supplying rules to enforce them.
§ Worked closely with the Data Architect to define requirements for staging tables between environments as a new process was being developed to expedite the processing of new LOBs into the current system.
§ Utilized the Ab Initio tool for profiling data from 4 legacy systems before and after ETL was applied to the initial data load.
§ Participated in the development of best practices documentation for the Data Profiling process, which included the full set of steps and procedures for quality data. This work has been implemented and utilized as part of the full systems standards documentation.
Technical Business Analyst (Consultant)
07/05 – 10/05    InfoSolv Technologies, Inc.
Responsibilities: Initiated client meetings for source-to-target mapping sessions. Developed spreadsheets to detail mappings and ETL scripts to be developed offshore. Managed offshore personnel in the development of the newly designed Oracle database. Created a full suite of supporting documents.
§ Gathered client requirements to determine the data conversion effort, including source-to-target mapping strategies.
§ Developed a full set of documentation to support the data mapping effort.
§ Conducted data mapping sessions with the client and client partner.
§ Implemented standardization practices for the automation process.
§ Analyzed and documented data flow diagrams to be used as a tool for implementing the ETL process.
§ Gathered and analyzed unique business processes and fully documented scenarios to ensure no breaks in functionality.
§ Created ETL processes to be implemented by developers.
Database Engineer / Data Analyst (Consultant)
10/01 - 03/05    Dept of Defense - Computer Science Corporation
§ Developed, supported, and implemented SAP data staging conversion processes.
§ Participated in the modeling of logical and physical designs for data marts and the data warehouse.
§ Experienced in ETL processes to support the target database, including bulk data loads (see the sketch after this list).
§ Analyzed existing legacy data systems for conversions. Used MS Access as an interim means to upload and analyze data before it was staged to the Oracle DB instance.
§ Developed bulk scripts in the UNIX environment for staging data through production.
§ Skilled in physical data modeling of database systems using Erwin for reverse and forward engineering.
§ Experienced in TOAD/SQL for data analysis/validation.
§ Developed and implemented Korn shell scripts to load data to Oracle as part of the ETL process.
§ Experienced in source-to-target data mapping for a large SAP project, including gap analysis.
§ Applied data to the Oracle database using SQL*Loader and imports via a UNIX box.
§ Fully documented mapping sessions with the user community and gathered necessary user sign-offs.
§ Validated data cleansing requirements/strategies for data elements with the client.
§ Supported Oracle processes via Oracle Client and UNIX.
§ Analyzed COBOL/DB2 programs to determine file configurations/program specifications.
§ Developed implementation strategy documentation.
§ Identified unique processes in the legacy system that would need to be addressed in the SAP deployment.
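The bulk loads above are described as using SQL*Loader and Korn shell scripts; purely as an illustration of the same staging idea, the sketch below uses an Oracle external table instead. All names (stage_dir, stg_material_ext, stg_material, material_extract.csv) are hypothetical and not taken from the actual project.

-- Illustrative alternative to the SQL*Loader approach named above; all names are hypothetical.
-- Expose a flat file as an external table, then bulk-insert its rows into the staging table.
CREATE TABLE stg_material_ext (
  material_id VARCHAR2(18),
  plant_code  VARCHAR2(4),
  qty_on_hand NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('material_extract.csv')
)
REJECT LIMIT UNLIMITED;

INSERT /*+ APPEND */ INTO stg_material (material_id, plant_code, qty_on_hand)
SELECT material_id, plant_code, qty_on_hand
FROM   stg_material_ext;
COMMIT;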
Data Quality Control Analyst (Consultant)
12/98 - 7/01    Pearson Technology Center, Old
§ Implemented test automation using the Mercury Interactive testing tool WinRunner.
§ Customized WinRunner scripts using TSL code.
§ Successfully used WinRunner to automate the order entry system for test purposes.
§ Assisted in the development of Business Process Scenarios for testing the SAP go-live.
§ Project Lead for a full life cycle test of the Consumer/Order Entry System.
§ Evaluated the XPEDITER/Code Coverage tool. The tool was utilized to determine the logical path of data executed in mainframe programs and was also incorporated to determine functionality in the mainframe process when no business rules had been documented.
§ Created detailed specifications for team members to implement.
§ Successfully integrated the Wally World system into the existing Universal Order Processing System.
§ Maintained and enhanced internal COBOL/Embedded SQL programs.
§ Created/maintained DDL for implementing DB2 objects such as databases, tablespaces, tables, and alters (see the sketch after this list).
§ Performed DB2 table loads to produce replicas of production DB2 tables, including runstats and image copies.
§ Built entire systems for quality control, which included Procs, JCL, control cards, Bind/DBRM libraries, load libraries, etc.
§ Created JCL to execute mass binds of multiple plans.
§ Created queries and reports using QMF queries, forms, and reports.
§ Revised the strategy for new systems testing for quality control, which allowed for more accurate processing. This included data integrity issues and file retention on backup tapes.
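To illustrate the DB2 DDL work mentioned above, a minimal sketch follows; the database, tablespace, and table names (QCORDERS, TSORDERS, QC.ORDER_COPY) are hypothetical and not taken from the actual quality control system.

-- Minimal DB2 DDL sketch; all object names are hypothetical.
CREATE DATABASE QCORDERS;

CREATE TABLESPACE TSORDERS IN QCORDERS;

-- A quality-control copy of an order table, placed in the new tablespace.
CREATE TABLE QC.ORDER_COPY (
  ORDER_ID  INTEGER NOT NULL,
  CUST_ID   INTEGER NOT NULL,
  ORDER_DT  DATE    NOT NULL,
  STATUS_CD CHAR(2)
) IN QCORDERS.TSORDERS;

-- Unique index to enforce order identity.
CREATE UNIQUE INDEX QC.XORDER_COPY ON QC.ORDER_COPY (ORDER_ID);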
Education
Essex County College / NJIT
Major: Mathematics
AVTECH Institute of Technology
Certificate: E-Commerce / Web Development - 2009
Certificate: Quality Assurance Engineering (includes Data Warehouse Analysis)
Background Clearances
1/2003 National Agency Check (7-year background check)
Opened 1/29/2003 / Closed 3/06/2003    Investigating Agency: Performed by OPM Government HR Department
References
References are available on request.
Download Resume Format