2.4.0 Post-Release Review - The Good, the Bad, the Ugly
Attendees: Stephen Marquard,
Jim Eng: Resources. There were problems that required design changes, and we couldn't back them out because of dependencies from other projects. The work was done a little haphazardly; they were trying to do too much during the last week. One thing several people talked about was a phased UI design freeze, and other types of staged work as goals for each release cycle. If doesn't happen for.
Send out URL to strawman proposal
Stephen Marquard: Would like to see a more methodical process.
Sean: Seems like there were a lot of builds; some never saw the light of day. Some were
Sean: QA fatigue over time - going over the same ground over and over. Rebuilding the test bed was tedious, and some people put it off. Chicken-and-egg problem. Perhaps we need unit testing: have developers perform and pass tests prior to handing off to QA.
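A minimal sketch of the "developers test before hand-off" idea. The class and the normalize() rules below are hypothetical illustrations, not actual Resources code; the point is a self-contained check a developer can run and pass before a build goes to QA.

```java
// Hedged sketch: a developer-runnable unit test, passed before QA hand-off.
// normalize() is a hypothetical helper, not real Sakai code.
public class ResourcePathTest {

    /** Hypothetical helper: normalize a content path the way a Resources-like
     *  service might (collapse slashes, ensure leading /, strip trailing /). */
    static String normalize(String path) {
        if (path == null || path.isEmpty()) return "/";
        String p = path.replaceAll("/+", "/");
        if (!p.startsWith("/")) p = "/" + p;
        if (p.length() > 1 && p.endsWith("/")) p = p.substring(0, p.length() - 1);
        return p;
    }

    static void check(boolean ok, String name) {
        if (!ok) throw new AssertionError("failed: " + name);
    }

    public static void main(String[] args) {
        check(normalize("group//site1/file.txt").equals("/group/site1/file.txt"), "collapse slashes");
        check(normalize("").equals("/"), "empty becomes root");
        check(normalize("/a/b/").equals("/a/b"), "trailing slash stripped");
        System.out.println("all checks passed");
    }
}
```

Checks like this are cheap to run on every build, which is what would let QA skip ground already covered.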
How do we feel about the information available to testers?
- Seth: Good, accurate information is not being made available. It's a right-hand/left-hand problem: the only group that knows what the different hands are doing is QA.
Spent a lot of time
Hannah: Do we have the needed information identified?
CM API: the available information is technical and was passed on to dev teams. Where should people be looking for info: technical, functional, UI, how the changes affect things?
What is the purpose of the QA cycle?
When we ship a release, we should be able to speak to what changed. Determine what the functional changes are and be able to verify them. There needs to be a definitive list: your project doesn't get into the release unless you provide a list. What level of detail do we need? E.g., changes 1-5 are backend changes; these others are changes with UI impact. Often there is
The list of project leads is a mile long, and there are figureheads listed. It's a large, sprawling project. Perhaps we need to recommend that we need
Is it possible to come up with a template?
- design documentation
- functional specs
- Jiras: describe what they think is required by testing to verify the changes made.
Assignments was mature.
- Chat replacement was a semi-nightmare.
- Message Center Split is not complete. Release notes
Who cracks down? Should it be us? Should it be . Peter hopes that creating milestones will help alleviate this. Seth disagrees: this is just another attempt. We need to have some sort of central coordination. Will attempt web bridge via sakai001.
Question: When did we move away from running a tool for a semester? This reminds Hannah of Mark Norton's requirement of tool status. Seth gives the community credit for following the requirement model.
Course Management API: no documentation; ask Josh Holzman. Projects should be required to provide such documentation. It isn't acceptable that this is the support model.
Reviewed the PPR. We're not learning anything; we're evolving very slowly, and this goes above QA. Part of this is about product and PR - when it doesn't work out of the box,
Stephen wants to go back to the way we go about testing and dates. How can we get to a point that we know we've Xyz,
Go back to developing test scripts and unit tests; use Selenium scripts for the most important functionality. Caveat: these scripts tend to be very brittle.
Sean: One of the things OSP testers complained about was the amount of time spent on site setup. Sean created scripts for site setup.
Stephen: Use web services to set up sites, Selenium for the user interface. The scripts can be written to be more robust over time. If we are methodical about this, we should build these one by one, with developers keeping them up to date.
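The scripted site setup Sean and Stephen describe might look like the sketch below. SiteService and every id in it are hypothetical stand-ins; the in-memory implementation only makes the sketch runnable, and in practice the interface would be backed by web services (or Selenium), built up one script at a time.

```java
import java.util.*;

// Hedged sketch of scripted QA test-bed setup. All names are assumptions,
// not the actual scripts referenced in the meeting.
public class TestBedSetup {

    interface SiteService {
        void createSite(String siteId, String title);
        void addTool(String siteId, String toolId);
    }

    /** In-memory fake so the sketch runs without a server. */
    static class InMemorySiteService implements SiteService {
        final Map<String, List<String>> sites = new LinkedHashMap<>();
        public void createSite(String siteId, String title) { sites.put(siteId, new ArrayList<>()); }
        public void addTool(String siteId, String toolId) { sites.get(siteId).add(toolId); }
    }

    /** One QA site per tool under test, each with a common baseline tool. */
    static void buildTestBed(SiteService svc, List<String> toolsUnderTest) {
        for (String tool : toolsUnderTest) {
            String siteId = "qa-" + tool;
            svc.createSite(siteId, "QA: " + tool);
            svc.addTool(siteId, "sakai.resources"); // baseline tool in every site
            svc.addTool(siteId, tool);
        }
    }

    public static void main(String[] args) {
        InMemorySiteService svc = new InMemorySiteService();
        buildTestBed(svc, Arrays.asList("sakai.assignment", "sakai.gradebook"));
        System.out.println(svc.sites.keySet());
    }
}
```

Because the site spec is data (the list of tools), rebuilding the test bed each QA cycle becomes a rerun rather than tedious manual work.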
Let's talk about the UI frameworks:
- JSF - Gradebook
- Velocity - tool
- RSF -
Performance issues fail to come up because we're not testing on environments similar to production. We need to test tools and services with a wide range
Web services are a good way of realistically testing these; Forums is the exception. This should be a tool status requirement: anything with an API should have it.
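A rough illustration of the idea of driving concurrent load against a service entry point rather than relying on single-user manual testing. The stub call and the thread/call counts are placeholders, not a real Sakai web-service client; a real run would point many workers at a production-like environment.

```java
import java.util.concurrent.*;
import java.util.concurrent.atomic.AtomicInteger;

// Hedged sketch of multi-user load generation against a service call.
public class LoadSketch {

    /** Submit 'calls' invocations of a stub service call across 'threads'
     *  workers and return how many completed. */
    static int runLoad(int threads, int calls) {
        AtomicInteger completed = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int i = 0; i < calls; i++) {
            // stand-in for a real web-service hit; timing it would give latency data
            pool.submit(() -> { completed.incrementAndGet(); });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get();
    }

    public static void main(String[] args) {
        System.out.println("completed " + runLoad(8, 200) + " of 200 calls");
    }
}
```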
Have designated times to test together.
- How can we better build
- Need more information.
Continue to recommend releases with a lot of bug fixes. This decreases the workload on implementors, who know that they don't need