What Happens if One Signs a Job Offer?

by Kristin Swain

Receiving a job offer is exciting. A job offer is a contract between you and the employer that outlines the terms of your employment, such as your start date and salary. Signing the offer means you accept the job and agree to those terms.

Once You Accept

Once you accept a job offer, you are ethically obligated to abide by its terms. Most companies treat a signed offer as a contract: you have accepted the job and agreed to start work on a specified date. The company ceases its employee search, and the hiring manager begins making arrangements for your start date, such as setting up your office space, ordering business cards and ensuring that you have the computer access you need for your first day. You are also expected to stop your own job search.

Changing Your Mind

You might receive a better offer after you have already accepted a position, leading you to back out of the job you agreed to take. Reneging on a signed job offer is considered poor workplace ethics and may even be a breach of contract. Though most employers won't pursue legal action against a candidate who changes his mind, doing so reflects badly on you and may follow you if you apply for another job with that company in the future.

About the Author

Residing in Los Angeles, Kristin Swain has been a professional writer since 2008. Her experience includes finance, travel, marketing and television. Swain holds a Bachelor of Arts in communication from Georgia State University.
