Hey folks -
WOPR, the Workshop on Performance and Reliability, is a non-profit peer workshop focused on real experience in the field. WOPR usually involves a small group (~10 to 30 people) and consists of a series of experience reports facilitated with K-cards - one at a time, lightly pre-organized, with the Q&A going as long as the group has energy.
I went to WOPR last spring and wrote about it for cio.com.
This spring, WOPR is coming to Sweden, preceding Let's Test. Dan Downing, one of the organizers, suggested I spread the word about it.
Also - yes, I am organizing a TestRetreat in NYC the Saturday before CAST, along with Matt Barcomb and Anna Royzman. So if you're going to "Let's Test" anyway, you might consider WOPR. If you're going to CAST ... :-)
Details on WOPR below my signature.
Managing Principal, Excelon Development
http://www.xndev.com
---------- Forwarded message ----------
From: Dan Downing <email@example.com>
Date: Wed, Feb 12, 2014 at 8:22 PM
Subject: FW: WOPR22 Call for Proposals (CFP)
Would you spread the word in your circles about WOPR22 going to Sweden in May?
WOPR22 – Call For Proposals (CFP)
The Content Owner, Eric Proegler, along with the WOPR Organizers, invites you to submit your proposal for WOPR22.
WOPR22 Theme: Early Performance Testing
Performance Testing is usually thought of as an activity conducted against "completed" software deployed on production hardware, or a copy of production hardware. These tests are designed for "realism": simulating a sample of user activity at scales chosen to match expected production loads, in environments with "identical" resource capacity and configuration. The value of this kind of test lies in its perceived accuracy in predicting how the live system will perform and how reliable it will be.
This approach is used for validating completed, assembled systems right before go-live, and can be effective for addressing these goals. But what do we do when the software is not complete, when the system is not yet assembled, or when the software will be deployed in multiple environments with different workloads?
In agile/iterative development methodologies, the time between writing code and deploying it has become very short, to the point where high-fidelity simulation before deployment is often impractical or impossible. Cloud deployments and virtualization have made the quantity of available resources very hard to determine. These challenges are pressuring the testing, measurement, and evaluation of performance, scalability, and reliability to evolve so that incomplete and rapidly changing systems can still be tested.
We would like to hear how you test for performance and reliability against systems that are in development, undergoing major refactoring, and/or are not yet deployed to production-class environments. WOPR22 will share the experiences and learning of practitioners who are testing in these new contexts. We want to learn with you about providing actionable performance and reliability feedback earlier in the life cycle, and how to apply it more broadly.
Contexts for Experience Reports
Your experience might touch on one or more of these questions, or it may not. If you have a story that relates to the theme, we would like to hear it.
When do you start testing for performance and reliability in your projects? What are your entry criteria?
How does your approach to performance testing change when testing a system that does not have "production-class" resources?
How do you test components for performance and reliability? What techniques do you use to substitute for missing pieces of a system?
What do you do when you do not have a usage model for end users? How do you decide which activities to simulate? What does your load model look like?
Have you added performance/reliability testing to a Continuous Integration process? How is that going?
How do you report test results when "realism" isn't present?
About Experience Reports
Experience Reports describe an experience that you had firsthand. You will tell your story as a narrative, with (or without) any visual aids you feel might help you explain the situation (architecture diagrams, graphs, etc.). Then you will answer questions about what happened, and the group will discuss issues brought to mind by hearing your story. The goal is not to present tools, methodology, or practice, but to share what you saw and did in your work. Both positive and negative outcomes are instructive.
More guidance on experience reports can be found here.
You can also learn more about Experience Reports by watching a video on the subject here.
Conference Location and Key Dates
Location: Verisure in Malmö, Sweden (near Copenhagen)
WOPR22 Dates: Wednesday - Friday, May 21 - 23, 2014
WOPR Dinner: Tuesday, May 20, 2014
Deadline for Proposals: Monday, February 17, 2014
Invitations Sent: Monday, March 3, 2014
Applying for WOPR
WOPR conferences are invitation-only and generally over-subscribed. We restrict attendance to fewer than 25 people. We usually have more applications and presentations than can fit into the workshop; not everyone who submits a presentation will be invited to WOPR or asked to speak.
Our selection criteria are weighted heavily toward practitioners and toward interesting ideas expressed in WOPR applications. We welcome anyone with relevant experiences or interests. We are always looking to invite new talent, and to identify and support up-and-comers reaching intermediate levels of expertise. Please apply, and see what happens.
Presentations will be selected by the WOPR organizers, and invitees notified by email according to the above dates.
You can apply for WOPR22 here.
WOPR is a not-for-profit, low-cost workshop; however, we do have expenses, and we ask WOPR participants to help us offset them. The expense-sharing amount for WOPR22 has been set at $300. If you are invited to the workshop, you will be asked to pay the expense-sharing fee to indicate acceptance of your invitation.
WOPR is a peer workshop for practitioners to share experiences in system performance and reliability, to allow people interested in these topics to network with their peers, and to help build a community of professionals with common interests. Our constituency is people who are interested in system performance, reliability, testing, and quality assurance.
WOPR's primary focus is on evaluating system performance and reliability. This includes performance measurement, load and stress testing, scalability testing, reliability measurement and evaluation, and system and product certification. WOPR is not vendor-, consultant-, or end-user-centric, but strives to provide a mix of viewpoints and experiences. The range of potential interest is broad, from single-device embedded systems to complex multi-tier, multi-platform, multi-vendor ecosystems with operational responsibility for components spread across multiple organizations.
To learn more about WOPR, visit here, connect on LinkedIn and Facebook, or follow @WOPR_Workshop.