We have a content database that is 300 GB in size. We have had no problems with backups since switching to LiteSpeed; before the switch we would see serious performance degradation on our web sites while backups ran.
For the record, we did NOT want a content DB this large. We had specific business requirements around content sharing that would have been very difficult to implement had we split the content into separate site collections.
When we first went live we had major locking issues with the database during peak usage. We traced this back to use of the CrossListQueryCache object in SharePoint. Moving off that API fixed most of our performance problems.
I wrote up a little blog article with more information here.
We still see locking issues with certain types of updates: deleting BLOBs larger than 20 MB, and renaming webs (which can update a very large number of rows in the AllUserData table). We are working with MS Support on specific cases (e.g. removing large items from the recycle bin). These have been traced back to the way specific SharePoint stored procedures delete data, but we do not have a solution yet.
Personally, I think the problems start once you accumulate a certain number of rows in the AllUserData table, and the easiest way for MS to communicate that to people was to say "stay below 100 GB."
I suggest pinging the people at MS IT... I have heard, off the record, that they run a SharePoint content DB larger than 800 GB.