Do you read the meter?
Posted: Mon Jan 30, 2023 9:17 am
So here is a funny thing about technology. We all love our tech and use it non-stop. However, do you ever read the meter?
It is obvious a CRT display/TV is going to use more than 100W, and I doubt they use less, but an LCD will use 100W point blank, while an LED set (those tiny lights) will magically use way less, and an OLED will go down even more.
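Just to put rough numbers on that, here is a quick sketch in Python; the wattages are my own ballpark assumptions for each technology, not measured specs:

    # Ballpark annual energy for a display running 8 hours/day.
    # The wattages are assumptions for illustration, not measured specs.
    displays = {"CRT": 120, "LCD": 100, "LED-backlit": 50, "OLED": 40}

    hours_per_day = 8
    for tech, watts in displays.items():
        kwh_per_year = watts / 1000 * hours_per_day * 365
        print(f"{tech:12s} ~{watts}W -> about {kwh_per_year:.0f} kWh/year")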
....
But what are you losing? Many manufacturers claim a better picture. The Apple QLCD from the 200X years was top of the line, leaving LED as the primary technology. But guess what? Your LED TV can stop working just because:
A. An LED went out and needs to be replaced.
B. You need to open your television up and use a blow-dryer on a chipset area because you get no picture or sound from an output.
C. Maybe a board needs to be replaced.
.....
I always wondered why my American-made LCD television (that is correct, we have manufacturing in the USA) would always work even if the picture had that ugly blue overcast at times, in comparison to a Sony, LG, or even Samsung that would go out or have some kind of issue. I have a TV from 2012, or is it 2008, still going strong. Unlike my Amazon TV, it does not block my laptop or computers over HDMI. If I wanted to run a Fire Stick I could.
......
But here is the kicker: it is an LCD, and it still burns 100W. With kWh prices climbing, that becomes a burden on the bill. I did the calculations for non-stop usage and wow, I am burning that cash like there is no tomorrow. Then my work computer's QLCD, which gives the best image, is also over 100W, but it is ten times better than most displays.
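For anyone who wants to run the same math, here is a minimal sketch; the 100W draw and the $0.15/kWh rate are my assumptions, so plug in the numbers from your own bill:

    # Rough cost of a display running non-stop (all inputs are assumptions).
    watts = 100           # assumed display draw
    hours_per_day = 24    # worst case: never turned off
    rate_per_kwh = 0.15   # assumed electricity rate in $/kWh

    kwh_per_day = watts / 1000 * hours_per_day    # 2.4 kWh/day
    cost_per_day = kwh_per_day * rate_per_kwh     # about $0.36/day
    cost_per_year = cost_per_day * 365            # about $131/year
    print(f"{kwh_per_day:.1f} kWh/day -> ${cost_per_day:.2f}/day, ${cost_per_year:.0f}/year")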
.......
Of course, I brought this up because I discovered two tools/programs for my computer.
joulemeter.msi - Basically a tool that shows which resource in your computer is using how much power. The problem with Joulemeter is that it uses telemetry (meaning it reports usage data back to Microsoft) and must be started when the operating system loads, though you can turn off the startup entry and still keep it installed. Microsoft has pushed it out of the limelight because it integrated the functionality into later Visual Studio releases, as its original purpose was for programmer/IT/business use, not consumers.
and a third-party tool called
HWMonitor - From CPUID, the same people behind CPU-Z (which identifies exactly what hardware and which vendor you have installed via identification codes and a database). You can see nearly every aspect of a computer, from its temperatures to its electrical ratings. Watch as you run a resource-heavy program and see your 10W jump to 30W in mere seconds.
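If you want to log those jumps instead of just eyeballing them, a little script can average the readings over a window. Note that read_watts() below is a hypothetical stand-in; HWMonitor does not expose a public API, so this just fakes a sensor value to show the bookkeeping:

    import random
    import time

    def read_watts():
        # Hypothetical stand-in for a real power sensor; here we just
        # fake a value between an idle draw and a loaded draw.
        return random.uniform(10.0, 30.0)

    samples = []
    for _ in range(10):                # ten readings, one per second
        w = read_watts()
        samples.append(w)
        print(f"current draw: {w:.1f} W")
        time.sleep(1)

    avg_w = sum(samples) / len(samples)
    wh = avg_w * len(samples) / 3600   # seconds -> hours, so watt-hours
    print(f"average: {avg_w:.1f} W, about {wh:.3f} Wh over the window")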
........................................................................................
I have been thinking about it a lot, from seeing family members just leave lights on all over the place, to clients having no AC even inside of auditoriums.