The best way to do this really depends on the quality and nature of your data and queries. For starters, 180MB of data in a single table for products is not a problem, whichever way you look at it. And 30k queries per day is even less of a problem. With a properly configured database, any old desktop can handle this load.
Others have already pointed out your two main options: MySQL or a NoSQL database.
If there is a set of attributes that exists for every single product (such as manufacturer, price, warehouse number, etc.), then your best option is to have a column for each of these attributes and convert your key/value pairs into a flat table format, with a product ID as the primary key for that table. This will work very well even if some columns are only used by half of the rows, since for most products you will only need to run one query to retrieve all of their attributes. Considering that this is data about products, I would guess that it is quite likely this is the structure of your data. A rough sketch of that layout is shown below.
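Here is a minimal sketch of the flat-table approach. It uses Python's built-in sqlite3 module only so the example is self-contained; the table names, columns, and sample values are made up for illustration, and the same schema applies to MySQL.

```python
# Flat-table sketch: one row per product, one column per shared attribute.
# sqlite3 is used only to keep the example self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE products (
        product_id   INTEGER PRIMARY KEY,
        manufacturer TEXT,
        price        REAL,
        warehouse_no INTEGER    -- NULL for products that lack this attribute
    )
""")
conn.execute(
    "INSERT INTO products (product_id, manufacturer, price, warehouse_no) "
    "VALUES (?, ?, ?, ?)",
    (1, "Acme", 19.99, 42),
)

# One query returns every attribute of a product, instead of one row per
# key/value pair that would then need to be pivoted in application code.
row = conn.execute(
    "SELECT * FROM products WHERE product_id = ?", (1,)
).fetchone()
print(row)
```

The point is that a product lookup is a single indexed read on the primary key, rather than a join or a multi-row fetch against a key/value table.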
If the attributes vary widely in presence and data type, then you might be better off using a NoSQL database, which handles this scenario more efficiently than traditional SQL databases, since each product only stores the attributes it actually has.
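For comparison, here is a hedged sketch of the document-store alternative, assuming pymongo is installed and a MongoDB instance is reachable on localhost; the database and field names are made up for illustration.

```python
# Document-store sketch: each product is a document carrying only the
# attributes it actually has, so there are no NULL columns to manage.
from pymongo import MongoClient

client = MongoClient("localhost", 27017)
products = client["shop"]["products"]

products.insert_many([
    {"_id": 1, "manufacturer": "Acme", "price": 19.99, "warehouse_no": 42},
    {"_id": 2, "manufacturer": "Globex", "price": 5.50, "voltage": "230V"},
])

# Retrieving all attributes of a product is still a single round trip.
print(products.find_one({"_id": 2}))
```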
Regarding performance: I have previously worked for an e-commerce company where, for a long time, the website was served from a single MySQL server. That server had 2GB of RAM, the database was approx. 5GB in total, and under peak load it handled several thousand queries per second. Yes, we had done a lot of query optimization, but this is definitely doable.