Any usability cost-benefit analysis should value people's time based on their fully loaded cost and not simply on their take-home salary. The cost to a company of having a staff member work for an hour is not just that person's hourly rate; it also includes the cost of benefits, vacation time, facilities (office space, heating, cleaning, computers, etc.), and the many other costs associated with having that person employed.
In principle, time wasted on having to use a bad user interface should be valued as marginal time. Thus, the theoretically correct way to account for its cost is not the average cost of the employees' time but the marginal value of their time, which requires a so-called hedonic wage model. In practice, marginal values are rarely known or even easy to estimate, so it is quite common to use average values. The simplest way to derive the average loaded cost of an employee is to count up your total corporate expenses and divide them by the total number of productive hours worked. Whether you want to count, say, staff meetings as productive time is obviously a matter of interpretation, but the point is not to count time spent on training seminars, lunch breaks, and similar activities that may be necessary but do not generate output.
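The back-of-the-envelope calculation above can be sketched as follows; the expense and staffing figures are invented purely for illustration:

```python
def average_loaded_hourly_cost(total_expenses: float, productive_hours: float) -> float:
    """Average fully loaded cost per productive hour: total corporate
    expenses divided by total productive hours worked."""
    return total_expenses / productive_hours

# Hypothetical example: $10,000,000 in annual corporate expenses and
# 50 employees each delivering 1,600 productive hours per year.
cost = average_loaded_hourly_cost(10_000_000, 50 * 1_600)
print(cost)  # 125.0 dollars per productive hour
```

Note that the denominator counts only productive hours, so excluding lunch breaks and training time raises the resulting hourly figure, which is exactly the intended effect.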
Commonly, the fully loaded cost of an employee is at least twice his or her salary. This is why consultants charge so much more than regular employees: their billable hours have to cover the many overhead costs that are implicit for your full-time employees. In fact, looking at common consulting rates for the kind of staff you are dealing with is a shortcut for estimating the fully loaded value of your employees' time.
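The rule of thumb above can be expressed the same way; the salary and hours below are invented for illustration, and the factor of two reflects the "at least twice salary" heuristic from the text:

```python
def loaded_hourly_cost_from_salary(annual_salary: float,
                                   productive_hours: float,
                                   load_factor: float = 2.0) -> float:
    """Estimate the fully loaded hourly cost from salary alone, using the
    heuristic that loaded cost is at least load_factor times salary."""
    return load_factor * annual_salary / productive_hours

# Hypothetical example: a $60,000 annual salary and 1,600 productive
# hours per year, at the default load factor of 2.
print(loaded_hourly_cost_from_salary(60_000, 1_600))  # 75.0 dollars per hour
```

If local consulting rates for comparable staff are known, they can serve as a sanity check on the `load_factor` chosen here.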