how many ounces in a cup and how many ounces in a pound?
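For the record, here is the answer being asked about, sketched in Python. This assumes US customary units (a later reply in the thread rightly notes it depends where you live, since UK/metric cups differ), and the function and constant names here are just illustrative. Note that a cup measures volume (fluid ounces) while a pound measures weight (ounces), so the two conversions are not interchangeable.

```python
# US customary kitchen conversions (UK/metric cup sizes differ)
FLUID_OUNCES_PER_CUP = 8   # 1 US cup = 8 US fluid ounces (volume)
OUNCES_PER_POUND = 16      # 1 pound = 16 ounces (weight)

def cups_to_fluid_ounces(cups):
    """Convert a volume in US cups to US fluid ounces."""
    return cups * FLUID_OUNCES_PER_CUP

def pounds_to_ounces(pounds):
    """Convert a weight in pounds to ounces."""
    return pounds * OUNCES_PER_POUND

print(cups_to_fluid_ounces(2))   # 16 fluid ounces
print(pounds_to_ounces(1.5))     # 24.0 ounces
```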


  • @i-am-male many thanks, that's what was required

  • Watch Anime Eyes

    @jessica-mormont sarcasm is unbecoming of you, young chad. The point is: just fucking go to Google for these types of things


  • @i-am-male AI is a dangerous thing


  • @thestrangest As is global warming, and pandemics, and giant space rocks.


  • @thestrangest awww poor foul mouthed baby, i apologize for hitting that nerve so hard....
    may peace and tranquility find you and google be your guide


  • @i-am-male said in how many ounces in a cup and how many ounces in a pound?:

    @thestrangest As is global warming, and pandemics, and giant space rocks.

    Artificial Intelligence is a much bigger danger than all of those


  • @jessica-mormont said in how many ounces in a cup and how many ounces in a pound?:

    @thestrangest awww poor foul mouthed baby, i apologize for hitting that nerve so hard....
    may peace and tranquility find you and google be your guide

    You are very smug and not funny at all


  • @thestrangest Depends where you're living ;)


  • @i-am-male how so?


  • @thestrangest Hurricane season has already gotten a lot worse for those living along the East Coast, antibiotic resistance is becoming a big issue, and bioterrorism will become a real threat now that CRISPR, a form of gene editing that even dummies can access and afford, is accessible to virtually anyone. Imagine creating a virus that combines Ebola, the flu, and HIV; that's possible now. Giant space rocks wiped out the dinosaurs, and I don't think ruling them out is a good idea.

    AI is limited by electricity, so even if machines take over the world, we would just have to pull the plug on electricity and wait for them to run out of juice. Or, go somewhere with high humidity so they'll rust.


  • @i-am-male said in how many ounces in a cup and how many ounces in a pound?:

    @thestrangest Hurricane season has already gotten a lot worse for those living along the East Coast

    You do realise that we're talking about what's more likely to end human life. If global warming were the cause, you do realise that would take at least another 100 years from now, but with AI, 25 years is the minimum

    antibiotic resistance is becoming a big issue, and bioterrorism will become a real threat now that CRISPR, a form of gene editing that even dummies can access and afford, is accessible to virtually anyone. Imagine creating a virus that combines Ebola, the flu, and HIV; that's possible now. Giant space rocks wiped out the dinosaurs, and I don't think ruling them out is a good idea.

    1st, you do not realise that as much as CRISPR will be widespread in the near future, these types of big genetic modifications are still tricky. It will take another 30 years before bioterrorism could possibly become a thing, right? And even then, that wouldn't easily constitute the end of all human life, because just as easily as CRISPR could be used to create a life-erasing virus, it could be used to create an antidote. An asteroid hitting Earth in the next 100 years would have been detected by now, and by then we would probably be an interplanetary species.

    AI is limited by electricity, so even if machines take over the world, we would just have to pull the plug on electricity and wait for them to run out of juice.

    That is not how it works. Not only would cutting all electricity across a country immediately result in a lot of deaths, because of our dependence on water, food, and other things that can only be accessed using electricity (if that's what you meant by cutting electricity), but AI would definitely not be dumb enough to somehow let us just cut off its electricity.

    Or, go somewhere with high humidity so they'll rust.

    Do I really have to explain how stupid this is? Most electronics are already made out of materials that can't rust, so why would we degrade the quality of materials for AI?


  • @thestrangest said in how many ounces in a cup and how many ounces in a pound?:

    You do realise that we're talking about what's more likely to end human life. If global warming were the cause, you do realise that would take at least another 100 years from now, but with AI, 25 years is the minimum

    This is with respect to my earlier answer "Depends where you're living", so it's fathomable to think people living on the coast will experience more intense storms and face a higher risk of mortality. (https://www.gfdl.noaa.gov/global-warming-and-hurricanes/)

    [Chart: Hurricane Frequency]

    1st, you do not realise that as much as CRISPR will be widespread in the near future, these types of big genetic modifications are still tricky. It will take another 30 years before bioterrorism could possibly become a thing, right? And even then, that wouldn't easily constitute the end of all human life, because just as easily as CRISPR could be used to create a life-erasing virus, it could be used to create an antidote. An asteroid hitting Earth in the next 100 years would have been detected by now, and by then we would probably be an interplanetary species.

    I think you're underestimating the rate at which CRISPR is spitting out new knowledge into the world (https://www.elsevier.com/research-intelligence/campaigns/crispr, 2013-15).

    It's terrorism for a reason; consider that we haven't completely wiped out all computer viruses. It takes time to develop a vaccine or cure for any pathogen, and if bioterrorists really wanted to make something tricky, it could take a long time to cure it.

    EDIT: forgot to address the asteroid point; here's a quotation from NASA (NEO = near-Earth objects):
    "With so many of even the larger NEOs remaining undiscovered, the most likely warning today would be zero," NASA informs us. We would see nothing at all until suddenly, just as the impact occurred, we noticed a "flash of light and the shaking of the ground as it hit." Then poof.
    (https://www.forbes.com/sites/erikkain/2013/01/10/we-probably-wouldnt-even-see-a-doomsday-asteroid-until-it-was-too-late/#3fc3c5737735)

    That is not how it works. Not only would cutting all electricity across a country immediately result in a lot of deaths, because of our dependence on water, food, and other things that can only be accessed using electricity (if that's what you meant by cutting electricity), but AI would definitely not be dumb enough to somehow let us just cut off its electricity.

    Currently, our technology is still tame enough that AI couldn't just use robots to build itself and its power source a nice bunker to defend itself without us knowing. Worst-case scenario, we nuke the facilities supplying power to it, or another country nukes ours (AI shouldn't have access outside national borders, and if it did grow this capable, I'd imagine we'd shut it down much sooner).

    Assuming we could cut the electricity, we should be able to take AI off the grid relatively quickly and restore power to essential services. If you want to argue there's a risk involved, ... considering that the tradeoff is the end of the world, I think it's justifiable to cut the power. Also, it's feasible that water could still be conveyed to a large extent just by gravity flow; only some places would experience outages.

    Do I really have to explain how stupid this is? Most electronics are already made out of materials that can't rust, so why would we degrade the quality of materials for AI?

    Mainly saying if AI uses drones to hunt for humans, it's feasible that they wouldn't last indefinitely from risk of rusting. (https://forum.dji.com/thread-57238-1-1.html)


  • @i-am-male I'm impressed by your knowledge of the world :)


  • @girlnextdoor It's probably still not enough, but I'm open to learning and revising what I know.


  • @i-am-male I'll have you know that I do plan on responding to you with other articles, after reading and responding to the articles you've shown, to explain why AI is more dangerous, but I'm postponing that a lot since it'll take time. Since Elon Musk, Bill Gates and Stephen Hawking all agree that AI is the most dangerous thing being worked on and will probably be at the level we all fear by 2050, am I wrong to say that the things you are mentioning all seem to be after that date? Because that's my initial argument, and I don't see that you've addressed it head-on


  • @thestrangest Well to be fair, I only meant that global warming, pandemics, and giant space rocks are also dangerous things, following the chain of discussion. There's no way of actually knowing when any of these (including AI taking over the world) will occur. I further qualified by saying "depends where you're living" to tip the scales in favor of global warming.

    I don't think we ever established an initial argument, especially considering I was the one who brought up AI. (I recommend going back and reviewing the thread)

  • Watch Anime Eyes

    @i-am-male yeah, that's fine. Of course you don't know which one's going to occur first, and neither do I; ignore the previous thread. I meant that AI is more dangerous because it is more LIKELY to occur first, and I didn't mean anything else, but sure. By the way, it is always possible to know what's more likely to happen. I never said that CRISPR and global warming were not dangerous, but hey. I don't know