Short explanation:
If an algorithm is of Θ(g(n)), it means that the running time of the algorithm as n (input size) gets larger is proportional to g(n).
If an algorithm is of O(g(n)), it means that the running time of the algorithm as n gets larger is at most proportional to g(n).
Normally, even when people talk about O(g(n)) they actually mean Θ(g(n)), but technically, there is a difference.
More technically:
O(g(n)) represents an upper bound. Θ(g(n)) means a tight bound. Ω(g(n)) represents a lower bound.
Big O means your algorithm will execute in no more steps than the given expression, asymptotically (like n^2).
Big Omega means your algorithm will execute in no fewer steps than the given expression, asymptotically (like n^2).
When both conditions are true for the same expression, you can use Big Theta notation:
f(x) = Θ(g(x)) iff f(x) = O(g(x)) and f(x) = Ω(g(x))
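Spelled out with the hidden constants made explicit (the constants c, c1, c2 and the threshold n0 below are the standard quantifiers from the textbook definitions, not something stated above):

```latex
f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 : 0 \le f(n) \le c\,g(n) \text{ for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c\,g(n) \ge 0 \text{ for all } n \ge n_0
f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 : c_1\,g(n) \le f(n) \le c_2\,g(n) \text{ for all } n \ge n_0
```

So Θ pins f(n) between two constant multiples of g(n), which is exactly "O and Ω at once".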
Basically, when we say an algorithm is O(n), it's also O(n^2), O(n^1000000), O(2^n), ... but a Θ(n) algorithm is not Θ(n^2).
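A quick way to make the "tight bound" idea concrete is to count steps directly. This is a sketch of my own (the function name and the double-loop choice are illustrative, not from the discussion above): the inner statement runs exactly n(n-1)/2 times, so the step count is Θ(n^2). It is also, trivially, O(n^1000000), but that upper bound tells you much less.

```python
def count_pairwise_steps(n):
    """Count how many times the inner statement of a double loop
    over n items runs: exactly n*(n-1)//2 times, i.e. Theta(n^2)."""
    steps = 0
    for i in range(n):
        for j in range(i + 1, n):
            steps += 1  # one "step" of work per pair (i, j)
    return steps

for n in (10, 100, 1000):
    # The measured count matches the closed form n*(n-1)/2 exactly.
    print(n, count_pairwise_steps(n), n * (n - 1) // 2)
```

Because the count equals n(n-1)/2 for every n (not just an upper limit on it), both the O(n^2) and Ω(n^2) conditions hold, which is what licenses writing Θ(n^2).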