SKILL SUMMARY:
· Over 14 years of QA experience in all phases of the software development life cycle, including requirements gathering, risk analysis, project planning, scheduling, testing, defect tracking, management, and reporting
· Adept at manual testing across system, integration, user acceptance, positive and negative, functionality, object, smoke, stress, performance, and regression testing
· 1½ years testing financial software, 2 years testing a CRM product, 1 year testing an electrical dispatch product, 8 months testing voice recognition software, and 10 years testing pharmaceutical software
· Develop use cases, user interface specifications, and user requirement specification documents
· Liaise with developers, business analysts, and user representatives in application design and document reviews
· Superior analytical, troubleshooting, communication, and presentation skills
· Over 8 years of
DEVELOPMENT AND APPLICATION SOFTWARE:
· Quality Center, Cognos PowerPlay and ReportNet, Informatica PowerCenter 7.2.1, SQL Navigator 4.4, Toad, VB6 and VB.NET, Oracle Client, C++ back end and Visual Basic front end, Java 1.2 and 1.3, Crystal Reports, VBA, Sonic, Chainsaw, all MS Office products, VMware, VPN, Citrix, XML Spy, Microsoft Expression Blend, web and application services, Rational ClearQuest, Fortran-based mainframe applications, Lotus Notes
· Databases: Oracle, SQL Server, DB2, AS/400, and MS Access
· Defect tracking systems: Bug Tracker, Rational ClearQuest
· Document repositories: DM Extension, Documentum, SharePoint
· Testing methodologies: Waterfall, Scrum, Iterative
PROFESSIONAL EXPERIENCE:
BNY Mellon (Mutual fund exchange)
Dates: Sept 2010 to Present
Company/Title: Consultant with CEI / Sr. QA Analyst
Tested application: Mutual fund main processing software
Development and Application Software: Fortran-based mainframe application (DCL commands), JIRA (defect tracking), Lotus Notes
Responsibilities:
· Build a high level of knowledge of system interfaces using mainframe and client-server applications
· Define, develop, and maintain test scripts; review test scripts and provide feedback to less experienced team members
· Translate business and technical requirements to develop and execute comprehensive system test estimates, test plans, test cases, and test scripts
· Assist programmers in translating business requirements and establishing unit test data
· Identify, document, and research discrepancies between requirements and test results
· Find and develop test databases for the functional areas under test
· Facilitate detailed test plan reviews, held before and after test execution, with key departments within the company
PJM (Electrical source company)
Dates: July 2009 to July 2010
Company/Title: Consultant with YOH / Sr. QA Analyst
Tested application: Dispatcher UI software for energy-sharing daily output
Development and Application Software: SOA, Microsoft Expression Blend design package (XAML output), C#, Oracle 9i, Agile, Microsoft .NET Framework, Citrix, XML Spy, web and application services, DM Extension, SQL Server
Responsibilities:
· Validated two projects developed using Agile methodology
· Tracked, analyzed, and reported defects using the designated defect management system
Project 1 – IEP: Warning and alarm messaging software alerting dispatchers to any issue with a monitored electrical source
Project 2 – SCED: Tracks electrical usage for multiple types of energy sources using real-time information along with historical data. Validated the views and inputs of this UI-based project.
Blue Cross and Blue Shield (Health insurance)
Dates: July 2007 to July 2009
Company/Title: Consultant with Keystone Computers / Sr. QA Analyst
Tested application: Claim processing software
Development and Application Software: Batch processing in AS/400 v5 environment, HP Quality Center
Responsibilities:
· Worked closely with business analysts in separate departments to obtain the existing business rules needed to test modules effectively
· Defined complex, previously undocumented claim flows through meetings with business owners to determine how medical claims were processed prior to the new requirements being implemented
· Created test plans, test scripts, and requirement import sheets
· Worked closely with all departments to ensure product compliance
· Performed data mining for test results using a custom UI SQL tool pointing to the AS/400
· Managed the testing effort between developers and testers
Accomplishments:
· Constructed the largest testing bed within Blue Cross and Blue Shield
Merck & Co., Inc. (Pharmaceutical manufacturing co.)
Dates: March 2003 to July 2007
Company/Title: Consultant with Lionbridge / Sr. QA Analyst
Tested application: Various projects listed below.
Project 1 (Current) – Quest 1.2, 2.0, 6.0 and 7.0: Sales reporting tool for pharmaceutical statistics, with data comparisons for markets, products, physician scripts, etc. Multiple data feeds from various systems had to be validated for integration into the views used by the UI. Calculations for statistical information were validated using SQL test scripts derived from data design flows, charts, and mappings along with functional and business requirements.
Project 2 – FTR 1.0 and 2.0: Application that tracks national field reports for sales teams.
Project 3 – TPR: Reviewed the process in which data is created and cut for bonuses; reviewed the many sources of input data and created a streamlined QC environment for future data builds.
Project 4 – CFS/JV data: Tested all Quest Suite applications with new alignment data for validation.
Project 5 – CoP Network Database and Data Mart (06/2004 to 07/2005): A medium-to-large-scale project. The data mart was built from roughly 8 different databases (mainframe, SQL Server, Oracle, DB2, and various flat-file sources). The data gathered identifies who is part of the "community of practice" and who the leaders of those communities are.
Development and Application Software: XML, Access database. Validated data using Access and MSDE; validated all feeds going into the ETL process using Informatica. Quality Center, Cognos PowerPlay and ReportNet, Informatica PowerCenter 7.2.1, SQL Navigator 4.4, VB6 and VB.NET, Oracle Client, C++ back end and Visual Basic front end, Java 1.2 and 1.3, Crystal Reports, VBA, all MS Office products, Rational ClearQuest. Databases: Oracle, SQL Server, DB2, AS/400, and MS Access. HP Quality Center, Documentum, SharePoint, Waterfall, Iterative
Accomplishments:
· SDLC certified
· Considered the "go-to" person for hot jobs
· Became very familiar with data warehousing and the testing approaches needed to ensure accurate data
Responsibilities:
· QA Analyst – lead for offshore and onshore teams
· Used project documentation to create test plans, test case scenarios, and test scripts
· Analyzed, tracked, and reported defects encountered during test activities using the designated defect management system
· Participated as a hands-on test engineer for identifiable application components
· Maintained requirement traceability
· Organized, planned, and performed test sequence execution
· Participated in peer reviews of project deliverables
QVC-IVR (Online shopping network)
Dates: Jan 2002 to August 2002
Company/Title: Consultant with Advasta / QA Tester
Tested application: Call center voice response system
Responsibilities:
· Defined and prepared detailed test cases by quickly absorbing the complex system structure of the Customer Service Voice Response Unit
· Recommended updates to the IT community where documented processing flows were incomplete or inaccurate
· Reviewed test cases created by other team members and provided constructive feedback
· Coordinated combining test data with test cases; reported any discrepancies, with recommendations, to upper management and the IT community
· Reported group progress to upper management on a daily basis through constant communication and project tracking
Accomplishments:
· Identified new test cases that would expose loops in speech
· Took on the most challenging components and completed them in a timely fashion
Sedona Corporation (CRM for small to mid-tier banks)
Dates: September 1999 to September 2001
Tested application: My Portal; Customer Relationship Management software ("Intarsia"), www.sedonacorp.com
Development and Application Software: SQL Navigator, C++ back end and Visual Basic front end, Java 1.2 and 1.3, Crystal Reports, all MS Office products, Oracle, SQL Server, DB2, AS/400, and MS Access, Waterfall
Responsibilities:
· Organized the testing project, designed and built system tests, built the test environment, executed project integration tests, and executed operations acceptance tests. The main test types performed for each release were validation testing, specific functional testing, business acceptance testing, performance testing, and regression testing.
· Supervised the QA department: interviewed potential candidates for tester positions; wrote SOPs, test plans, and workflows for the department; conducted weekly status meetings with staff and served on the company's Production Development Team; recommended automated testing software and problem tracking software.
· Other responsibilities: evaluated software and discussed issues with developers; verified that software issues were resolved; provided input on the graphical user interface for new software components; tested on multiple platforms, databases, and web browsers. Required extensive knowledge of SQL (verified query statements against Intarsia using SQL*Plus). Conducted webcasts and live demonstrations of software functionality; reviewed all reported problems before entry into the bug tracking software and monitored each issue from open to close.
Accomplishments:
· Created a testing department
· Hired testing staff
· Made decisions on testing software selection
· Developed testing methodologies for the department
· Worked closely with the technical writer to create training material for future releases and conducted webcasts for peers
EDUCATION AND TRAINING: