There are actually two issues here – one is about the safety of the dose rate (1 microGy or 100 microR per hour); the other is whether or not that dose rate is accurate.
Let’s tackle the first one first. A dose rate of 100 microR/hr (1 microGy/hr) is not dangerous. If this dose rate is accurate, living in it continuously (8760 hours per year) will give you a radiation dose of about 0.9 R/yr (9 mGy/yr). This is not a trivial dose – it’s about three times the average annual dose we receive from natural background sources. At the same time, it’s less than 20% of what nuclear workers in the US are permitted to receive in a year and a little less than half of what radiation workers in Europe are permitted to receive. It’s also less than half the radiation dose rate I measured in Ramsar, Iran, which has the highest natural radiation levels of any inhabited place in the world. The residents of Ramsar do not appear to be suffering any ill health effects from their exposure – it seems unlikely that the dose rate you mention will cause any harm to you.
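If you’d like to check the arithmetic, here’s a minimal sketch of the conversion in Python (the only assumption beyond the numbers above is the usual rough equivalence of 1 R to about 10 mGy):

```python
# Annual dose from a constant 100 microR/hr (1 microGy/hr) field,
# assuming continuous occupancy: 24 hours a day, 365 days a year.
dose_rate_microR_per_hr = 100
hours_per_year = 24 * 365             # = 8760 hours

annual_dose_R = dose_rate_microR_per_hr * hours_per_year / 1e6
annual_dose_mGy = annual_dose_R * 10  # rough conversion: 1 R is about 10 mGy

print(f"Annual dose: {annual_dose_R:.2f} R/yr (~{annual_dose_mGy:.0f} mGy/yr)")
# -> Annual dose: 0.88 R/yr (~9 mGy/yr)
```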
Now – on to the second question!
The first thing to determine is whether the readings you note are accurate, and a lot of that depends on the exact kind of radiation instrument you’re using. I used GM instruments in the Navy and I continue to use them today – they’re incredibly useful. But I also recognize their limitations, one of which is that they’re not very accurate at measuring radiation dose rate – especially from low-energy gamma radiation, and even more so at low dose rates. One of the first questions I’d ask is whether the instrument has a digital display or an electro-mechanical one with a needle pointing at the dose rate. If it’s the latter, I’d also want to know whether the meter is on its very lowest scale with the needle at the very lowest tick mark on the meter face – if so, I’d take that reading with a considerable grain of salt. In general, I try to use a meter only when the reading falls somewhere between about 20% and 80% of the scale’s range.
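To make that rule of thumb concrete, here’s a hypothetical sketch – the function name and the scale values are my own illustration, not anything built into a real survey meter:

```python
# Rule of thumb: trust an analog meter only when the needle falls in the
# middle portion (roughly 20-80%) of the selected scale's full range.
def reading_in_usable_range(reading, scale_full_range,
                            low_frac=0.2, high_frac=0.8):
    """Return True if the reading sits between 20% and 80% of the scale."""
    return low_frac * scale_full_range <= reading <= high_frac * scale_full_range

# A needle at 100 microR/hr near the bottom of a 0-1000 microR/hr scale
# is suspect; the same reading mid-scale on a 0-200 range is more credible.
print(reading_in_usable_range(100, 1000))  # False - only 10% of scale
print(reading_in_usable_range(100, 200))   # True  - at 50% of scale
```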
Another question to ask is the size of the GM tube – a larger (and more expensive) tube intercepts more radiation and registers more counts, so at low dose rates it gives a statistically more reliable reading than a smaller, cheaper one.
But the main factor is that GM tubes – unless they are an “energy-compensated” type – are only accurate at measuring radiation of a single specific energy. So if your GM was calibrated (for example) using Cs-137 – which emits gammas at 662 keV (1 keV = 1000 electron volts) – then it can only accurately measure radiation dose from gammas at or near that energy. If you use it to measure natural background radiation – which has a lower average energy – the reading can be off by a factor of up to 10. This is because the meter “expects” every bit of radiation entering it to have the same energy as the Cs-137 gammas; if the radiation is lower-energy, the reading will be higher than the actual dose rate.
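Here’s an illustrative sketch of that over-response. The response factors below are made-up numbers chosen only to show the shape of the effect – a real tube’s energy-response curve would come from the manufacturer’s data sheet:

```python
# How an uncompensated GM tube, calibrated with Cs-137 (662 keV), can
# over-read at lower gamma energies. Factors are hypothetical examples.
assumed_relative_response = {
    662: 1.0,  # calibration energy: reads true by definition
    200: 3.0,  # hypothetical over-response at 200 keV
    80:  8.0,  # hypothetical over-response at 80 keV
}

true_dose_rate = 1.0  # microGy/hr, the actual field being measured

for energy_keV, factor in assumed_relative_response.items():
    displayed = true_dose_rate * factor
    print(f"{energy_keV:>4} keV gammas -> meter displays {displayed:.1f} microGy/hr")
```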
Anyhow – my best guess is that the dose rate displayed by your instrument is not accurate, for the reasons given here. But even if it is accurate, this level of radiation exposure should not be harmful.