IPC Classification Information

Country/Type: United States (US) Patent, Granted
International Patent Classification (IPC, 7th edition):
Application number: US-0475408 (1999-12-30)
Inventors / Address:
- Van Horn, Tom
- Gustafsson, Niklas
- Woodford, Dale
Applicant / Address:
Agent / Address:
Citation information: cited by 132 patents; cites 56 patents
Abstract

An online buying group (referred to herein as a "co-op") is formed for the specific purpose of purchasing a particular product at (102) by defining a start time, end time, critical mass, any minimum number of units offered, any maximum number of units offered, a starting price, and a product cost curve. As data is gathered from buyers through their binding purchase offers, the co-op is modified at (108) using a pricing tool, so as to take this market data into account in the definition of the price curve. A buyer chooses a product co-op of interest at (114). The buyer is presented with the essential co-op information: current price, closing time, and the next price level (as defined by a price curve visibility window and the price curve), sufficient to entice the buyer to make an offer. Once a buyer has made up his mind, the decision must be made at (116) either to offer the current price, guaranteeing availability if critical mass has been achieved, or to make an offer at a lower price that can be accepted only if the co-op price drops to that level, which may not occur. Given a decision to make an offer at such a lower price, the buyer enters at (118) the maximum price at which he is willing to purchase the product. Should the current price drop to the level at which the offer was made, the price contingency is removed from the offer and, assuming critical mass is achieved, the offer is accepted at the close of the co-op at (122) and processed accordingly. Inventory is allocated to fulfill the accepted offer at (126) following the closing of the co-op at (124).
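The offer lifecycle described in the abstract (unconditional offers at the current price, contingent offers below it, contingency removal as the price curve drops, and acceptance at close only if critical mass is reached) can be sketched in a few lines. This is an illustrative model only, not the patented implementation; all names (`Coop`, `Offer`, `place_offer`, and so on) are invented for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Offer:
    buyer: str
    max_price: float   # highest price the buyer is willing to pay (entered at 118)
    contingent: bool   # True while the offer is below the current co-op price

@dataclass
class Coop:
    current_price: float
    critical_mass: int                       # minimum accepted offers for a successful close
    offers: list = field(default_factory=list)

    def place_offer(self, buyer: str, max_price: float) -> Offer:
        # An offer at or above the current price is unconditional (116);
        # a lower offer is contingent on the price dropping to that level.
        offer = Offer(buyer, max_price, contingent=max_price < self.current_price)
        self.offers.append(offer)
        return offer

    def drop_price(self, new_price: float) -> None:
        # As the price curve moves down, remove the contingency from
        # offers whose maximum price is now met (122).
        self.current_price = new_price
        for o in self.offers:
            if o.contingent and o.max_price >= new_price:
                o.contingent = False

    def close(self) -> list:
        # At close (124), offers are accepted only if critical mass was reached;
        # otherwise no offer is accepted.
        accepted = [o for o in self.offers if not o.contingent]
        return accepted if len(accepted) >= self.critical_mass else []
```

For example, with a critical mass of 2, an offer at the current price of 100 and a contingent offer at 90 are both accepted once the price drops to 90 and the co-op closes.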
Representative Claims

5. … associating the expression with the performance area; determining the expression for the selected performance area; and including the expression associated with the selected performance area in the evaluation.

6. The method of claim 5, further comprising using the expression to score productivity data associated with the evaluation.

7. The method of claim 5, further comprising: storing a weight for each of the questions; for each question in the evaluation, including the weight for the question; using the weights in the evaluation to calculate a quality score for the selected performance area; using the expression in the evaluation to calculate a productivity score for the evaluation; and combining the quality and productivity scores to determine a performance score for the evaluation.

8. The method of claim 1, further comprising: receiving a selection of a guideline for the evaluation; dynamically determining each performance area associated with the selected guideline, the selected guideline comprising the dynamically determined performance areas; and dynamically determining questions associated with each of the dynamically determined performance areas.

9. The method of claim 8, further comprising: for each performance area, storing an expression operable to calculate a productivity score for the performance area based on productivity data associated with the performance area; associating the expression with the performance area; determining the expression for each of the performance areas associated with the selected guideline; and including the expressions in the evaluation.

10. The method of claim 8, further comprising associating a plurality of guidelines with one performance area.

11. A performance evaluation system, comprising: a database comprising a first database table that stores a plurality of questions associated with a call center, a second database table that stores a plurality of performance area identifiers, each performance area identifier corresponding to a performance area associated with the call center, a third database table that associates each of the performance areas with at least one of the questions and associates a plurality of the performance areas with one question, a fourth database table that stores a plurality of guideline identifiers, each guideline identifier corresponding to a guideline associated with the call center, and a fifth database table that associates each of the guidelines with at least one of the performance areas; and a database manager that receives a selection of at least one of the performance areas for an evaluation, interrogates the database tables to dynamically determine questions associated with the selected performance area, and generates the evaluation, including the dynamically determined questions in the evaluation.

12. The performance evaluation system of claim 11, wherein the first database table defines a weight for each of at least a subset of the questions.

13. The performance evaluation system of claim 11, the database further comprising a sixth database table that associates each question with a predefined answer type.

14. The performance evaluation system of claim 11, the database further comprising a sixth database table that defines a target score for each of at least a subset of the questions.

15. The performance evaluation system of claim 11, the database further comprising: a sixth database table that defines a plurality of expressions, the expressions each operable to calculate a productivity score for a performance area based on productivity data associated with the performance area, and a seventh database table that associates the expressions with the performance areas; and wherein the database manager interrogates the database tables to determine an expression associated with the selected performance area and to include the associated expression in the evaluation.

16. The performance evaluation system of claim 11, wherein the database manager receives a selection of at least one of the guidelines, dynamically determines performance areas for the selected guideline, and dynamically determines questions associated with each of the dynamically determined performance areas for the selected guideline.

17. The performance evaluation system of claim 16, wherein the fifth database table associates a plurality of guidelines with one performance area.

18. A method for generating an evaluation in a performance evaluation system, comprising: storing a plurality of questions associated with a call center; storing a plurality of performance area identifiers, each performance area identifier corresponding to a performance area associated with the call center; associating each of the performance areas with at least one of the questions; storing a plurality of guideline identifiers, each guideline identifier corresponding to a guideline associated with the call center; associating each of the guidelines with at least one of the performance areas; associating a plurality of the guidelines with one of the performance areas; receiving a selection of a member of the call center for evaluation of the member; receiving a selection of at least one performance area for an evaluation of the member; dynamically determining questions associated with the selected performance area, the selected performance area comprising the dynamically determined questions; and automatically generating the evaluation and including the dynamically determined questions in the evaluation.

19. The method of claim 18, further comprising: storing a weight for each of at least a subset of the questions; and including weights for the questions associated with the selected performance area in the evaluation.

20. The method of claim 18, further comprising: storing a target score for each of at least a subset of the questions; and including target scores for the questions associated with the selected performance area in the evaluation.

21. The method of claim 18, further comprising: associating a predefined answer type with each of at least a subset of the questions; and including predefined answer types for the questions associated with the selected performance area in the evaluation.

22. The method of claim 18, further comprising: storing an expression operable to calculate a productivity score for a performance area based on productivity data associated with the evaluation; associating the expression with the performance area; determining the expression for the selected performance area; and including the expression associated with the selected performance area in the evaluation.

23. The method of claim 22, further comprising using the expression to score productivity data associated with the evaluation.

24. The method of claim 22, further comprising: storing a weight for each of the questions; for each question in the evaluation, including the weight for the question; using the weights in the evaluation to calculate a quality score for the selected performance area; using the expression in the evaluation to calculate a productivity score for the evaluation; and combining the quality and productivity scores to determine a performance score for the evaluation.

25. A method for generating an evaluation in a performance evaluation system, comprising: storing a plurality of questions associated with a call center; storing a plurality of performance area identifiers, each performance area identifier corresponding to a performance area associated with the call center; associating each of the performance areas with at least one of the questions; associating a plurality of the performance areas with one question; storing a plurality of guideline identifiers, each guideline identifier corresponding to a guideline associated with the call center; associating each of the guidelines with at least one of the performance areas; …
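The five-table layout recited in claims 11 and 18 (questions, performance areas, an area-to-question link, guidelines, and a guideline-to-area link) can be sketched as a relational schema. This is a minimal sketch under assumed names: the table and column names (`questions`, `area_questions`, and so on) and the use of SQLite are illustrative choices, not the patent's.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- first table: questions, with an optional per-question weight (claim 12)
    CREATE TABLE questions (id INTEGER PRIMARY KEY, text TEXT, weight REAL);
    -- second table: performance area identifiers
    CREATE TABLE performance_areas (id INTEGER PRIMARY KEY, name TEXT);
    -- third table: many-to-many link between areas and questions
    CREATE TABLE area_questions (
        area_id INTEGER REFERENCES performance_areas(id),
        question_id INTEGER REFERENCES questions(id));
    -- fourth table: guideline identifiers
    CREATE TABLE guidelines (id INTEGER PRIMARY KEY, name TEXT);
    -- fifth table: link between guidelines and performance areas
    CREATE TABLE guideline_areas (
        guideline_id INTEGER REFERENCES guidelines(id),
        area_id INTEGER REFERENCES performance_areas(id));
""")

def questions_for_area(conn, area_name):
    """Dynamically determine the questions for a selected performance area
    by interrogating the link table, in the spirit of the database manager."""
    rows = conn.execute("""
        SELECT q.text
        FROM questions q
        JOIN area_questions aq ON aq.question_id = q.id
        JOIN performance_areas pa ON pa.id = aq.area_id
        WHERE pa.name = ?
        ORDER BY q.id""", (area_name,))
    return [r[0] for r in rows]
```

Because areas and questions are related only through the link table, one question can serve several performance areas and one area several questions, which is the many-to-many association the claims describe.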