Articles

This post is based on a talk that Gunnar Wagenknecht and I delivered at the Open Source Leadership Summit 2017 and Devoxx US 2017. This content was recently published in the All Eyes on Open Source issue of JAX Magazine.

Running an open source project is easy. All you have to do is make your source code available and you’re open source, right? Well, maybe. Ultimately, whether or not an open source project is successful depends on your definition of success. Regardless of your definition, creating an open source project can be a lot of work. If you have goals regarding adoption, for example, then you need to be prepared to invest. While open source software is “free as in beer”, it’s not really free: time and energy are valuable resources and these valuable resources need to be invested in the project.

Source : https://dzone.com/articles/running-a-successful-open-source-project-1

Testers + Scrum = ?

Several times I’ve had conversations with people working with Scrum or other Agile methodologies who claim they don’t have testers and don’t run into any problems. On the other hand, I have seen testers on such teams who often feel excluded from the development team. And testers who have not yet worked on Agile teams question whether there is even room for testers in Scrum.

It’s often touted that everyone on a Scrum team is able to perform different tasks and that all are responsible for quality. But there are some things that a tester can handle better than others. For example, writing good acceptance criteria requires a tester’s skillset, as one must keep in mind certain characteristics such as quality, testability, maintainability, etc. These are all things the tester role is responsible for obsessing over. Therefore, when you need to write acceptance criteria, you’ll be better off delegating it to someone trained in testing over someone who is not.

Source : https://dzone.com/articles/can-there-be-testers-in-scrum

For a long time, organizations banned access to social media at work, fearing that employees would spend their day goofing around on Facebook rather than doing any work. The idea of letting employees play games at work is even harder to square, yet a recent paper argues the case for just that, with a rationale very similar to the one for allowing social media access.

The paper highlights how roughly half of us experience regular cognitive fatigue at work, whether as a result of stress, anxiety, or any number of things. This is especially problematic in professions where a high cognitive load is placed on workers. So, the team wanted to test whether a short burst of gaming could help to alleviate some of that load and help both our performance levels and mental wellbeing.

Source : https://dzone.com/articles/the-benefit-of-playing-video-games-at-work

Josiah Renaudin: Welcome back to another TechWell interview. Today I’m joined by Cher Fox, of Fox Consulting. She’ll be leading a session at this year’s STARWEST about test automation for data-centric applications.

Cher, thank you very much for joining us. First, could you tell us a bit about your experience in the industry?

Cher Fox: I have over thirty years of experience in the information technology industry, including twenty years of partnering with clients on their data-centric development projects as an independent consultant. My experience has evolved from data analysis, help desk support, network infrastructure, programming, software development, testing, technical writing, and training to business analysis, business intelligence architecture, data modeling/mapping/ETL, data warehouse design/development, and process improvement.

Josiah Renaudin: Why do you think so few data-centric testing tools are targeted to data-related development and testing?

Cher Fox: Data-related development is proprietary per industry and product. With so many different development languages and methodologies, it is very challenging to develop a specific tool to meet so many needs. The few tools that exist reside embedded in a suite that includes design, development, and testing specific to the suite’s workflow.

Josiah Renaudin: Is there an ideal ratio between manual and automated testing, or does it entirely depend on the type of team or project being worked on?

Cher Fox: The ideal ratio between manual and automated testing would completely depend upon the team and the project being worked on. The development methodology would factor in heavily here; test- or behavior-driven development would lead to a much higher ratio of automated testing. A team’s maturity, skill set, and project management style would also contribute to the ratio of manual versus automated testing.

Josiah Renaudin: Can you succeed in our modern, agile software era without any test automation?

Cher Fox: Surely a project can succeed without any test automation, but not at the fast pace of today’s modern, agile era. Smaller increments of developed product demand automated testing for optimal speed of delivery and regression testing safety.

Josiah Renaudin: Have you worked with a lot of more traditional testers who are nervous that they’re going to be phased out as the industry continues to evolve?

Cher Fox: Any smart resource in any industry should be paying attention to their industry’s evolution and pacing their growth and development appropriately to remain marketable and competitive.

Josiah Renaudin: Why aren’t agile data teams automating their tests as much as they should? What are some general steps they can take today to head toward smart automation?

Cher Fox: This session will explore many reasons why agile data teams aren’t automating their tests. Attendees will leave with suggested areas to explore and improve upon within their internal environment before looking to external resources to implement automated testing.

Josiah Renaudin: How can teams use tools they already have to automate data-centric apps?

Cher Fox: This session will explore and demonstrate how teams can use their existing tools to begin automating the testing of their data-centric applications. Attendees will leave with a framework to use tools they may already have to get started.

Josiah Renaudin: More than anything, what central message would you like to leave with those who attend your session?

Cher Fox: Many attendees will come to the session with the idea that the lack of an automated testing tool is the roadblock to their environment’s successful test automation implementation. While this is a contributing factor, there are many other items to consider and address before implementing test automation. Attendees will leave this session with areas to review in their agile environment in preparation to implement test automation. They will also leave with a framework to build using their existing tools once they are ready to begin.

Cher Fox (@TheDatanista) brings thirty years of experience as a solution architect, developer, tester, and analyst in the business intelligence, data warehouse, and software development industries. Her experience includes more than twenty-five years of training corporate users on IT applications and business processes. A board member for the Colorado Chapter of TDWI (The Data Warehouse Institute), Cher is active in the SQL community, collaborating at local SQL Saturdays, SQL Server and Power BI user groups, PASS Summits, and BA conferences. Cher enjoys sharing her expertise via local, national, and international speaking engagements. Reach Cher at cfox@FoxConsulting.co, Fox Consulting, and at LinkedIn.

By AgileConnection Source : http://ift.tt/2fwGEk9

According to the agile SCRUM method, application development is organized as a recurring series of “Sprints,” partial development cycles.
During a Sprint, meetings called “Scrums” take stock of the work completed since the last meeting and plan what comes next until the following meeting.
Each Sprint ends with a testing phase for the features delivered “as is.”
The product “Backlog” references the implemented features.

Delivering the completed application modules as they are produced makes it possible to gauge the “customer’s” reaction immediately and to apply corrections and adjustments without delay, as long as only minor fixes are needed.
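The Sprint/Backlog cycle described above can be sketched as a minimal data model. This is an illustrative sketch only; the class names, fields, and features shown are assumptions for the example, not part of any Scrum standard or tool:

```python
from dataclasses import dataclass, field

@dataclass
class BacklogItem:
    """One feature referenced in the product Backlog."""
    title: str
    done: bool = False  # flipped to True once the feature passes the Sprint's test phase

@dataclass
class Sprint:
    """One partial development cycle."""
    number: int
    items: list = field(default_factory=list)  # features planned for this cycle

    def close(self):
        # End-of-Sprint test phase: only items that passed count as delivered "as is";
        # the rest roll over to a later Sprint.
        return [i for i in self.items if i.done]

# Hypothetical product backlog with two features
backlog = [BacklogItem("login"), BacklogItem("search")]
sprint = Sprint(number=1, items=backlog)

# During the Sprint, only "login" is finished and verified
sprint.items[0].done = True

delivered = sprint.close()
print([i.title for i in delivered])  # → ['login']
```

Delivering `delivered` to the customer at the end of each Sprint is what enables the immediate feedback loop the text describes: minor corrections become new backlog items for the next cycle.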