
How thorough are you with integration tests? (testing forms)


(Patrik Bóna) #1

Let’s say that I am going to have a form with 20 fields.

How would you test it? Would you fill every field and then make sure that data from every field is saved?

Or would you fill just required fields and just check for success message?

I use the first option, but sometimes it is cumbersome. The tests are more robust, but a little bit harder to maintain…

What is your approach?


(Jason Draper) #2

I don’t think there is a one-size-fits-all approach to these things; you have to weigh a few options:

  • How horrible would it be if a field doesn’t save? Is it business-crucial or just slightly annoying?
  • How easy will it be to find out when a field doesn’t save? Is it something you’ll notice through another test? Will your users notice as soon as they submit the form, or will they be really annoyed 6 months down the road when they find out the data didn’t save?

Most of the time I would write an integration test that hits every single field, but that shouldn’t make your tests much harder to maintain unless your application is primarily built around models with 20 fields. If so, I feel for you and will shed a tear!
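A feature spec along those lines might look like the sketch below. This assumes a Rails app with RSpec and Capybara set up; the `Profile` model, path helper, and field labels are all made-up for illustration:

```ruby
# spec/features/edit_profile_spec.rb -- hypothetical "fill every field" test
require "rails_helper"

RSpec.describe "Creating a profile" do
  it "saves every field" do
    visit new_profile_path

    fill_in "Name",  with: "Patrik"
    fill_in "Email", with: "patrik@example.com"
    # ...one fill_in / select / check per remaining field...
    click_on "Save"

    profile = Profile.last
    expect(profile.name).to  eq "Patrik"
    expect(profile.email).to eq "patrik@example.com"
    # ...one expectation per remaining field...
  end
end
```

With 20 fields this gets long, but it stays mechanical: one fill step and one expectation per field.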

The other option would be to test only the required fields through an integration test and then use a controller spec to make sure that you can save all the other data. This leaves parts of your form itself untested, but that is a tradeoff you may sometimes be willing to make.
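The controller-spec half of that tradeoff could be sketched like this, again assuming RSpec in a Rails app; the controller, model, and parameter names are hypothetical:

```ruby
# spec/controllers/profiles_controller_spec.rb -- hypothetical
# spec covering the optional fields the feature spec skips
require "rails_helper"

RSpec.describe ProfilesController do
  it "persists the optional fields" do
    post :create, params: {
      profile: {
        name:    "Patrik",            # required
        email:   "patrik@example.com", # required
        twitter: "@prikha",            # optional
        bio:     "Rails developer"     # optional
      }
    }

    profile = Profile.last
    expect(profile.twitter).to eq "@prikha"
    expect(profile.bio).to     eq "Rails developer"
  end
end
```

This verifies the controller whitelists and saves each attribute, but it won’t catch a field that is missing from the form template itself.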


(Joel Oliveira) #3

I’m making some assumptions here so …

The underlying data will more often than not be a model, or set of models, yes? The models themselves should be properly validated, and have those validations tested. As a result, your form will likely end up with a very binary outcome: either all fields were entered and valid, or there was an error. Yay or nay.
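As a toy illustration of that binary outcome, here is a plain-Ruby stand-in for a validated model (in a real app this would be an ActiveRecord model with `validates` declarations, tested in a model spec; the `Signup` class and its fields are invented for the example):

```ruby
# Hypothetical stand-in for a validated model: valid? is binary,
# so the form outcome is binary too (save succeeds or errors render).
class Signup
  REQUIRED = [:name, :email].freeze

  attr_reader :errors

  def initialize(attrs)
    @attrs = attrs
    @errors = []
  end

  # Collects one error per blank required field; true only when empty.
  def valid?
    @errors = REQUIRED.select { |f| @attrs[f].to_s.empty? }
                      .map { |f| "#{f} can't be blank" }
    @errors.empty?
  end
end

good = Signup.new(name: "Patrik", email: "patrik@example.com")
bad  = Signup.new(name: "Patrik", email: "")

good.valid? # => true
bad.valid?  # => false; bad.errors == ["email can't be blank"]
```

Because the validations live on the model and are tested there, the form test only needs to exercise the two outcomes, not every field combination.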

Having said that, I’ll usually test the happy path (all is good) and one permutation where a field is missing or invalid. That covers the good and bad scenarios.
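That happy-path/sad-path pair could be sketched as two Capybara scenarios, assuming a Rails app with RSpec and Capybara; the paths, labels, and messages are illustrative:

```ruby
# spec/features/signup_spec.rb -- hypothetical good/bad pair
require "rails_helper"

RSpec.describe "Signing up" do
  it "succeeds when every field is valid" do
    visit new_signup_path
    fill_in "Name",  with: "Joel"
    fill_in "Email", with: "joel@example.com"
    click_on "Sign up"

    expect(page).to have_content "Welcome!"
  end

  it "shows an error when a required field is blank" do
    visit new_signup_path
    fill_in "Name", with: "Joel"
    # Email intentionally left blank
    click_on "Sign up"

    expect(page).to have_content "Email can't be blank"
  end
end
```

Two scenarios exercise both branches of the binary outcome without enumerating every field.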

For the random edge cases that pop up later as issues (through Airbrake, Honeybadger, Errbit, etc.), I’ll write a failing controller test around the case and then make it pass, which hopefully prevents regressions.