GENESYS Redevelopment Project RFP
The Council invites proposals for the redevelopment of the GENESYS model software, which simulates the operation of the region’s power system and is used to assess the adequacy of the region’s power supply and the impacts and costs of non-power-related constraints placed on the operation of the region’s hydroelectric facilities.
Information on the Project: DEADLINE EXTENDED, See Below
The Seventh Power Plan identified the redevelopment of the GENESYS model as part of two Council action items (ANLYS-22 and ANLYS-23). The redevelopment of GENESYS is a collaborative effort between the Council, the Bonneville Power Administration, regional utilities and other interested parties.
Key Improvements Proposed:
- Time-dependent nature of the hourly hydro capability – model cascading of multiple dams as part of the system dispatch (i.e. simulate plant-specific hourly dispatch as opposed to aggregate hourly hydro dispatch)
- Interaction between assignment of reserves and system capacity – incorporate reserves into an optimized dispatch
- Trade-off between decisions for economics and adequacy – improved market representation
- Representation of limitations on operators in dispatching the system – add fuel accounting and forecast error
The software redevelopment process shall take place over an estimated 18-month period. It is anticipated that the first 6-10 months of the project (March 2017-December 2017) will be focused on finalizing GENESYS specifications and algorithms and beginning code work and testing cycles. The last 9 months of the project (January 2018-September 2018) will focus on continued coding and testing. During this latter period, the model must be sufficiently functional that the Council can give stakeholders an opportunity to provide input into the redevelopment effort. The redeveloped version of GENESYS must be completed, tested, and ready for use by the Council no later than September 1, 2018.
Additional information and instructions are in the full RFP:
Any questions should be sent no later than 5:00 p.m. on Thursday, January 5, 2017 to John Ollis (firstname.lastname@example.org). Questions will be discussed at the pre-bid conference on January 9 and then posted publicly, along with responses, that week.
Due Date for Proposals
For a proposal to be considered, it must be delivered to the Council electronically to:
Sharon Ossmann, Administrative Division Director
Northwest Power and Conservation Council
Vendor selected by the Council: PSR, Inc.
Per requests to extend the deadline for submitting RFP proposals for the redevelopment of the GENESYS model software, the Council is extending the deadline for proposals to no later than 5:00 p.m. Pacific Standard Time (PST) on Friday, February 10, 2017.
Pre-bid Conference - January 9, 2017 - View a video of the pre-bid conference
The Council held a pre-bid conference from 10:30 am to 12:00 pm PST on Monday, January 9, 2017 to respond to questions from potential respondents concerning the bidding process, the scope of work, etc.
Questions and Answers (updated 1/17/2017)
Would NWPCC entertain a proposal which includes enhancing a third-party model to achieve all the requested functionality?
If so, should we submit two separate bids if we also intend to offer an option to revamp Genesys?
If you think the bids are different enough in deliverable quality, timing and cost, then multiple bids are OK.
How does NWPCC anticipate handling the maintenance of the model after the final release?
Maintenance will depend on the proposal. If the proposal creates an obligation to purchase a license or enter into an ongoing maintenance contract, then that should be made clear in the proposal.
Are any additional details about the condition of the code available? lines of code/sample code/example input and output files/flow charts/software development methodology/3rd party add-ins
The website has technical specifications posted publicly. The existing FORTRAN codebase will be made available upon request, but it was not anticipated that this code would be included in the redevelopment.
Is there existing cascading hydro logic from another source that you are proposing to be coded into Genesys?
No, a large part of the project is implementing the cascading logic in the technical specifications.
Or does it need to be developed from scratch?
The methodology is developed as seen in the technical specifications; there is no existing code owned by the Council for this functionality.
Will subject matter experts provide guidance if so?
Which reserve requirements will be considered?
All reserve requirements listed in the technical specifications are desired. The capability to model other reserves is also a plus.
How about regulation-up/down, spin, non-spin, load following, frequency responsive?
Yes, and potentially others.
What different categories of optimization does GENESYS currently employ (e.g., linear, mixed integer linear programming, iteration between successive optimizations (for H/K), etc.)?
The Council currently uses a stand-alone preprocessor program to set limits on sustained (multiple-hour) peaking capability for hydro. This is used in GENESYS for shaping hourly hydro. The preprocessor includes a linear optimization. Another preprocessor program to develop monthly critical rule curves (minimum end-of-month elevations during poor water conditions) also utilizes linear optimization methods.
What different categories of optimization does the Council envision for the redeveloped GENESYS model (e.g., linear, mixed integer linear programming, iteration between successive optimizations (for H/K), etc.)?
Can you please explain your current distributed computing configuration, if any?
We do not use distributed computing for GENESYS, but it is employed for other Council models.
Number of machines to which the model is distributed
Number of total cores to which the model is distributed
Run time (with/without distributed computing)?
Approximately 2 CPU-seconds per one-year simulation, with an aggregate hourly hydro dispatch. We generally run over 6,000 simulations, which translates into about a 3-hour run time per study.
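The arithmetic behind this answer can be sketched in a few lines (a minimal illustration using the approximate figures quoted above; neither number is exact):

```python
# Back-of-the-envelope run-time estimate for one GENESYS study,
# using the approximate figures quoted above.
seconds_per_game = 2.0    # ~2 CPU-seconds per one-year simulation
games_per_study = 6_000   # "generally over 6,000 simulations"

total_seconds = seconds_per_game * games_per_study
total_hours = total_seconds / 3600
print(f"Estimated serial run time: {total_hours:.1f} hours")  # ~3.3 hours
```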
Total memory (RAM) requirements, if known.
How does the current run time compare with the specification below?
Run time is expected to be different.
Is this truly a requirement, or more of a target (or rough expectation), considering the final run time will be dependent on the tradeoffs made during development, including the nature of the optimization, desired additional complexity and granularity, number of cores to which the problem is distributed, etc.?
The 12 hour run time per study is a target (or rough expectation).
5.1 Performance requirements
GENESYS should be able to produce a full scenario run in 12 hours or less. A “full” scenario at the very least would be a study that simulates the operation over every combination of wind and temperature years, currently 6,160 games. In the future a full scenario will likely mean running with random combinations of all uncertain variables, which would require many more simulations, perhaps as many as 10,000 or more.
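The per-game time budget implied by this requirement can be sketched as follows (the core counts are hypothetical; the specification does not fix a hardware configuration):

```python
# Per-game time budget implied by the 12-hour full-scenario target,
# for a few hypothetical degrees of parallelism.
target_hours = 12
games = 6160  # combinations of wind and temperature years

for cores in (1, 16, 64):
    budget = target_hours * 3600 * cores / games
    print(f"{cores:3d} cores -> about {budget:.0f} CPU-seconds per game")
```

At a single core the budget is about 7 CPU-seconds per game, which is why distributed computing figures prominently in the specifications.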
Is the “expandability to AWS (Amazon Web Service) or a similar service” (per below) for distributed computing a desired capability for this redevelopment effort?
It is a desired capability, but not a requirement.
Or, is the Council more interested in whether future development opportunities would not preclude going down a path of web-based distributed computing?
It depends on cost and functionality.
In other words, can this redevelopment have one distributed computing approach (e.g., across Council servers), with a path toward distributing to AWS? Or, does an AWS capability (or similar web service for distributed computing) need to be part of the proposed scope? Any clarification of critical needs versus desires and development phases for this capability would be helpful.
AWS does not have to be part of the scope, but some form of distributed computing capability will be required.
“3.2 Hardware Interfaces The program should be capable of interfacing with cluster computing either through the Message Passing Interface (MPI) standard or another technology. Expandability to AWS (Amazon Web Service) or a similar service would be ideal.
3.4 Communication Protocols and Interfaces The main communication protocol will be the requirements for distributed computing. This will likely be some sort of MPI interface.”
Can you explain the rationale for having the objective function for the GENESYS redevelopment be to minimize operating costs rather than just to check if there is a resource supply shortage?
It is one formulation of the problem that works with the requirements; there may be others. However, resource supply is only one requirement of the system as it is formulated in the technical specifications.
In our experience, adequacy in the Northwest is highly dependent on an economic operation of the system. One significant reason for this is that the fuel supply for the hydro system is limited. Operators make decisions every day based on economics that consume some of the limited fuel and make it unavailable for adequacy purposes. Capturing this dynamic is a main objective of this redevelopment.
Is the proposed model redevelopment going to be used for more than evaluating resource adequacy?
Will this be a stochastic optimization?
The technical specifications are not formulated as stochastic optimization. However, if the problem can be formulated equivalently to produce the same basic result, then we are open to proposals that include this. Given the temporal dependency in this problem, it would be important to show that it would be prudent to go down that path in the proposal.
Or, will each individual game be a single optimization, repeated for each of several thousand Monte Carlo runs?
The technical specifications are written as if the problem will be done as a traditional Monte Carlo problem, or a pre-filtered Monte Carlo problem using a single optimization for each game. However, it would likely be important to be able to run the full Monte Carlo problem to evaluate any filtering logic. Thus, at least on occasion, it is likely a Monte Carlo simulation with several thousand runs would need to be performed.
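The traditional Monte Carlo structure described above can be sketched in pure Python (a minimal illustration; the shortfall test, random draws, and `simulate_game` placeholder are invented for this sketch and are not the GENESYS formulation):

```python
import random

def simulate_game(rng):
    """Placeholder for one game: in GENESYS this would be a single
    optimization over one simulated year. Here we just draw a toy
    load and resource level and report whether a shortfall occurred."""
    load = rng.gauss(100.0, 10.0)
    available = rng.gauss(110.0, 10.0)
    return available < load  # True if this game saw a shortfall

def run_monte_carlo(n_games, seed=0):
    """Traditional Monte Carlo: repeat the single-game simulation for
    each of several thousand independent games and tally the shortfall
    frequency (a loss-of-load-probability style metric)."""
    rng = random.Random(seed)
    shortfalls = sum(simulate_game(rng) for _ in range(n_games))
    return shortfalls / n_games

lolp = run_monte_carlo(6160)  # game count from the specifications
print(f"Shortfall frequency: {lolp:.3f}")
```

Any pre-filtering logic would prune the set of games fed to this loop, which is why an occasional full run over all games is needed to validate the filter.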
Do you consider that the algorithms/equations specified in the “GENESYS Redevelopment Requirements Specifications" document (pp 26-73) are sufficiently prescribed so as to not require new algorithm development by the subcontractor to implement the solution? Or, do you expect that additional algorithm development will be required by the subcontractor? In other words, do you consider this to be more implementation (of an already determined algorithmic approach) or invention (developing the algorithmic approach to solve the problem)?
The algorithms are anticipated to represent the lion's share of the constrained optimization problem, but some of the details of the specifications are covered in the narrative and in the current version of GENESYS. That being said, there may be different ways to formulate the problem depending on the solution implementation, and solutions that accomplish the stated objectives do not necessarily have to be formulated in the exact same way as the specifications.
Note that much of the coding that facilitates the setup and execution of the constrained optimization problem (input/output structure, interface, etc.) has been described in narrative and likely would require some sort of invention.
Is it your expectation that if the MILP as specified in the “GENESYS Redevelopment Requirements Specifications" document (pp 26-73) does not efficiently solve for the expected 39 plants to be modeled, that NPCC would relax the number of plants down to the point that the problem will solve (i.e., is your risk mitigation regarding solve-ability of the MILP, which cannot be determined in advance, to reduce the complexity of the cascading hydro problem)? Or, are you expecting the subcontractor to take on the solve-ability risk through development of simpler or alternate algorithms?
It is our expectation that throughout implementation and testing of the solution the formulation may need to be adapted creatively to achieve the balance of reasonable run time and enough detail to satisfy the main objectives. It may be necessary to relax some constraints in certain situations, but the expectation is that through distributed computing and adapting the formulation, the problem will be able to solve in a reasonable amount of time. While reducing the number of plants would be one method of relaxation, we would likely look to relaxing other constraints first. There are several existing models that can solve the water-balance equations using individual plants. While these models have not been adapted to be used for reliability studies, they do speak to the feasibility of this type of problem.
How many different non-hydro resources do you expect to model (e.g.., X thermal units, Y Demand Response bins, etc.)?
This depends on the solution. These estimates are intended to be ballpark figures. Non-hydro resources are not expected to affect run time in the same way hydro resource dispatch will.
Thermal Units: around 1000
Wind and solar units: around 500
Demand Response Bins: around 50
Thermal Units: <200
Wind and solar: at least 10
Demand Response Bins: at least 8
Where do the deployment stage scenarios come from?
The deployment stage (true-up stage) is primarily for representing the last element of fuel (solar, wind, hydro, gas, etc.) and load uncertainty and the effect of forced outages.
Does the NWPCC expect proponents to present some methodology for producing scenarios that are intrinsically coherent in the week, day, hour ahead and deployment stages?
The Council still plans on providing the lion's share of the data required to run GENESYS.
Monthly coupling of day-ahead UC runs
The specifications of the new GENESYS show that day-ahead commitment problems are solved every hour, and initial conditions for the following hour take effect after the deployment stage (“realization” of stochastic variables). Moreover, it is said that the model will only start working on the month of November after October is completely solved up to the last hour of the last day. This scheme would imply solving more than 8,760 Mixed Integer Linear Programming (MILP) problems sequentially for each game (scenario combination). If so, even supposing the model can perfectly parallelize the 6,160 games, it would still be necessary to solve 8,760 MILPs in each computation kernel. If we assume each MILP takes 30 seconds to solve, it would take 73 hours of computation just to solve the 8,760 day-ahead unit commitment problems, significantly more than the specified 12 hours. The question is: how important is it to solve all 8,760 MILPs completely and sequentially? It would be helpful if each of the 14 periods could be solved independently from the other periods.
Council staff is aware there may be computational restrictions to the approach recommended, and is looking for reasonable solutions. It is likely possible to simplify some aspects of the problem.
Note: It is always complicated to set time limits in MIP problems that are combinatorial in nature. It may be necessary to use heuristics / approximations in the UC problems (such as relaxing some integrality constraints for UC binary variables) to solve the entire adequacy problem for all games within the time constraint. This is well understood. To the extent that simplifications are unbiased and reasonable for power system planning purposes, there is no issue with using these methods.
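The relaxation idea mentioned above can be illustrated with a toy example (the three units and their costs are hypothetical, not Council data; a real UC relaxation would relax integrality inside a MILP solver, whereas this sketch simply contrasts an exact brute-force commitment with the lower bound obtained by dropping the commitment structure entirely):

```python
from itertools import product

# Hypothetical three-unit example: each unit has minimum and maximum
# output (MW), a no-load cost ($/h), and a marginal cost ($/MWh).
UNITS = [
    {"min": 40, "max": 100, "noload": 200.0, "marginal": 20.0},
    {"min": 30, "max": 80,  "noload": 100.0, "marginal": 35.0},
    {"min": 10, "max": 50,  "noload": 50.0,  "marginal": 60.0},
]
DEMAND = 150  # MW

def dispatch(on_units, demand):
    """Merit-order dispatch of a committed unit set; returns total cost,
    or None if the commitment cannot feasibly meet demand."""
    if sum(u["min"] for u in on_units) > demand:
        return None
    if sum(u["max"] for u in on_units) < demand:
        return None
    cost = sum(u["noload"] for u in on_units)
    cost += sum(u["min"] * u["marginal"] for u in on_units)
    remaining = demand - sum(u["min"] for u in on_units)
    for u in sorted(on_units, key=lambda u: u["marginal"]):
        extra = min(remaining, u["max"] - u["min"])
        cost += extra * u["marginal"]
        remaining -= extra
    return cost

def exact_uc():
    """Exact (combinatorial) unit commitment by brute-force enumeration
    of all on/off combinations."""
    best = None
    for flags in product([0, 1], repeat=len(UNITS)):
        on = [u for u, f in zip(UNITS, flags) if f]
        c = dispatch(on, DEMAND)
        if c is not None and (best is None or c < best):
            best = c
    return best

def relaxed_uc():
    """Heuristic relaxation: drop the commitment structure (and hence
    the minimum-generation and no-load terms) so every unit can run
    anywhere in [0, max]. This gives a lower bound on the exact cost."""
    cost, remaining = 0.0, DEMAND
    for u in sorted(UNITS, key=lambda u: u["marginal"]):
        take = min(remaining, u["max"])
        cost += take * u["marginal"]
        remaining -= take
    return cost

print(f"exact UC cost: {exact_uc():.0f}")   # combinatorial optimum
print(f"relaxed bound: {relaxed_uc():.0f}") # always <= exact cost
```

The gap between the relaxed bound and the exact cost is the kind of bias a proposer would need to show is acceptable for adequacy purposes.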
NWPCC system dimensions
In the specs there is a single reference to the number of hydro plants in the section describing the H/K procedure (39 plants). What is the expected number of thermal plants (or units, if modeled as such)?
The same for the variable resources (eg. wind and solar)?
Wind and solar plants will have fuel limitations defined by region. It is possible that these resources could be aggregated at a regional level for this problem if there is any difficulty with computational time from including all the plants.
The question relates to the complexity of the many UC problems that will be solved and the time constraint that is set in the specs (12 hours).
Will vendors have access to the GENESYS Model to be able to execute simulations and review the data files?
Who currently owns the GENESYS software?
GENESYS is in the public domain, except for the HYDSIM subroutines, which are a property of the Bonneville Power Administration. It would be preferred that the redeveloped model also stay in the public domain. If the proposal is to do something that differs from this model that should be clear.
Will proposers have access to the GENESYS source code prior to proposing to assess its condition?
It is not anticipated that the existing FORTRAN codebase would be redeveloped; rather, some of the approaches and logic would be included in the redeveloped model. The existing code will be made available upon request.
Will the selected vendor maintain the source code?
The preferred approach would be to have a source code repository that is mirrored from the vendor to the Council. Maintenance going forward would likely be coordinated between the Council and the vendor.
Will users license the software from the vendor?
The Council would consider a model where GENESYS is licensed, however, it is preferred that the many regional stakeholders have access to the model to engage in the Council’s planning processes.
How many lines of code exist in the GENESYS Model?
Approximately 41,600, not counting the HYDSIM routines.
Does the council prefer a rewrite of GENESYS, or would modifying an existing model that has similar capabilities to GENESYS be an option? If the latter is acceptable, would it require a benchmarking with GENESYS results?
It is not anticipated that modifying the existing model would be less effort than rewriting the model. Certain benchmarks to the existing GENESYS model will be required, such as monthly energy simulation. However, the hourly simulation cannot be compared to the existing GENESYS because the current version just does not have that level of granularity.
Does GENESYS or any of the input models use a third party solver? Would it also require MIP capability for modeling unit commitment of thermal units?
The current GENESYS model does not use third party code, with the exception of the HYDSIM routines. However, the technical specifications are drafted in such a manner that it is anticipated a solver would be required.
The RFP suggests that a new interface should be structured and written in a language that will facilitate easy data modification. Would the Council consider a GUI written in a modern managed language + a Fortran-based core?
Absolutely. In fact, the interface should consider the use of database structures with the long-term goal of having the model access input directly from the database. In addition, we would encourage an interface that allows for management of individual or groups of studies.
Does the proposed solution need to continue to rely on HYDSIM?
The intention is to continue to use HYDSIM for the monthly hydro energy simulation, so that we can benchmark those results to BPA studies. However, the preference is to make the redeveloped GENESYS flexible enough to allow for other monthly hydro simulation inputs.
What is the current runtime for a typical year of simulation?
The current run time to analyze one year over 6,160 games is roughly 3 hours or about 1.5 to 2.0 seconds per game. But this is using aggregate hydro (e.g. one-dam model) for the hourly simulation. The redeveloped GENESYS will simulate the operation of individual hydro projects (up to 80) on an hourly basis.
What is the size of the typical input file and output files?
The typical size of a study folder (containing both the inputs and outputs) is roughly 100 to 120 MB. Output size depends on what debug features are turned on.
What is the estimated number of man-hours that went into developing the GENESYS Model?
Roughly two man-years of time went into the development of GENESYS. However, much of the code used was preexisting. For example, the old user interface was modified from an existing program. Also, HYDSIM was taken in its entirety.
Approximately how many man-hours are spent maintaining the model and fixing bugs per year?
Generally we hire a consultant to help with debugging and other maintenance. We usually contract for about 150 hours of work. In addition, about 2 times that amount of staff time is devoted to debugging and enhancement.
Other than the program feature updates, are there any known issues that need to be resolved in the redevelopment effort?
Not during the redevelopment process. There are other issues and enhancements that may be considered in the future.