Inside a Google data center

MALE SPEAKER 1: A data centre’s the brains of the Internet.

MALE SPEAKER 2: The engine of the Internet.

FEMALE SPEAKER 1: It is a giant building with a lot of power, a lot of cooling and a lot of computers.

MALE SPEAKER 3: It’s row, upon row, upon row of machines, all working together to provide the services that make Google function.

JOE KAVA: I love building and operating data centres. I’m Joe Kava, Vice-President of Data Centres at Google. I’m responsible for managing the teams globally that design, build and operate Google’s data centres. We’re also responsible for the environmental health and safety, sustainability and carbon offsets for our data centres. This data centre, here in South Carolina, is one node in a larger network of data centres all over the world. Of all the employees at Google, only a very, very small percentage are authorised to even enter a data centre campus. The men and women who run these data centres and keep them up 24 hours a day, seven days a week, are incredibly passionate about what they’re doing.

MALE SPEAKER 2: In layman’s terms, what do I do here?

FEMALE SPEAKER 1: I typically refer to myself as the herder of cats.

MALE SPEAKER 4: I’m an engineer.

MALE SPEAKER 3: Hardware site operations manager.

MALE SPEAKER 2: We keep the lights on.

MALE SPEAKER 1: And we enjoy doing it.

JOE KAVA: And they work very hard, so we like to provide them with a fun environment where they can also play hard.

FEMALE SPEAKER 2: We just went past the three-million-man-hour mark for zero lost-time incidents. Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment.

JOE KAVA: I think that the Google data centres really can offer a level of security that almost no other company can match. We have an information security team that is truly second to none. You have the expression, “they wrote the book on that.” Well, many of our information security team members really have written the books on best practices in information security. Protecting the security and the privacy of our users’ information is our foremost design criterion.

We use progressively higher levels of security the closer you get to the centre of the campus. So, just to enter this campus, my badge had to be on a pre-authorised access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data centre, that’s a higher level of security. And the data centre and the networking rooms have the highest level of security. And the technologies that we use are different. For instance, in our highest-level areas, we even use underfloor intrusion detection via laser beams. So, I’m going to demonstrate going into the secure corridor now. One, my badge has to be on the authorised list. And two, I use a biometric iris scanner to verify that it truly is me.
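The progression Kava describes, where each inner zone adds its own checks on top of every check already required outside it, is a classic defence-in-depth layout. Here is a minimal sketch of that idea in Python, purely illustrative and not Google’s actual access-control system (the zone names and check names are assumptions):

    # Toy model of layered physical access: entering an inner zone requires
    # passing its own checks plus the checks of every zone that encloses it.
    # Zone and check names are illustrative assumptions, not Google's real system.
    ZONES = [
        ("campus",          ["badge_on_preauthorised_list"]),
        ("building",        ["badge_cleared_for_building"]),
        ("secure_corridor", ["badge_on_authorised_list", "iris_scan_match"]),
        ("data_centre",     ["badge_cleared_for_data_centre", "underfloor_intrusion_clear"]),
    ]

    def can_enter(target_zone, passed_checks):
        """True only if every check up to and including target_zone has been passed."""
        required = []
        for zone, checks in ZONES:
            required.extend(checks)
            if zone == target_zone:
                return all(check in passed_checks for check in required)
        raise ValueError("unknown zone: " + target_zone)

    # A badge on the pre-authorised list gets you onto the campus,
    # but not into the secure corridor without the extra checks.
    print(can_enter("campus", {"badge_on_preauthorised_list"}))           # True
    print(can_enter("secure_corridor", {"badge_on_preauthorised_list"}))  # False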
OK, here we are on the data centre floor. The first thing that I notice is that it’s a little warm in here; it’s about 80 degrees Fahrenheit. Google runs our data centres warmer than most because it helps with efficiency. You’ll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power and distribute it across the bus bars to all of the customised bus taps, which are basically plugs where we plug in all the extension cords. Our racks don’t really look like traditional server racks. These are custom designed and built for Google so that we can optimise the servers for hyper-efficiency and high-performance computing.

It’s true that sometimes drives fail and we have to replace them, or we upgrade them because maybe they’re no longer efficient to run. We have a very thorough end-to-end chain-of-custody process for managing those drives from the time that they’re checked out of the server until they’re brought to an ultra-secure cage, where they’re erased and crushed if necessary. Any drive that can’t be verified as 100% clean, we crush first, and then we take it to an industrial wood chipper, where it’s shredded into little pieces like this.
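That chain of custody is essentially a one-way lifecycle: a drive is checked out of a server, delivered to the secure cage, erased, and then either verified clean or crushed and shredded. A small sketch of that lifecycle as a state machine, paraphrasing the description above rather than any documented Google process:

    # Illustrative drive chain-of-custody, paraphrased from the video; the states
    # and transitions are assumptions, not a documented Google process.
    from enum import Enum, auto

    class DriveState(Enum):
        IN_SERVER = auto()
        CHECKED_OUT = auto()     # removed from the server, custody logged
        IN_SECURE_CAGE = auto()  # delivered to the ultra-secure cage
        ERASED = auto()
        VERIFIED_CLEAN = auto()  # destruction of data confirmed
        CRUSHED = auto()         # could not be verified 100% clean
        SHREDDED = auto()        # industrial wood chipper

    # Every allowed path ends in either VERIFIED_CLEAN or SHREDDED.
    TRANSITIONS = {
        DriveState.IN_SERVER:      {DriveState.CHECKED_OUT},
        DriveState.CHECKED_OUT:    {DriveState.IN_SECURE_CAGE},
        DriveState.IN_SECURE_CAGE: {DriveState.ERASED},
        DriveState.ERASED:         {DriveState.VERIFIED_CLEAN, DriveState.CRUSHED},
        DriveState.CRUSHED:        {DriveState.SHREDDED},
    }

    def advance(state, new_state):
        """Move a drive one step along the chain, rejecting any step that skips it."""
        if new_state not in TRANSITIONS.get(state, set()):
            raise ValueError(f"illegal custody step: {state.name} -> {new_state.name}")
        return new_state

    # Example: a drive that fails verification is crushed, then shredded.
    state = DriveState.ERASED
    state = advance(state, DriveState.CRUSHED)
    state = advance(state, DriveState.SHREDDED)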
In the time that I’ve been at Google, almost six and a half years now, we have changed our cooling technologies at least five times. Most data centres have air-conditioning units along the perimeter walls that force cold air under the floor. It then rises up in front of the servers and cools the servers. With our solution, we take the server racks and butt them right up against our air-conditioning unit. We just use cool water flowing through those copper coils that you see there. So the hot air from the servers is contained in that hot aisle. It rises up and passes across those coils, where the heat from the air transfers to the water in the coils, and that warm water is then brought outside the data centre to our cooling plant, where it is cooled down through our cooling towers and returned to the data centre. And that process is just repeated over and over again.
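The loop he describes comes down to a simple heat balance: the water flowing through the coils has to carry away whatever heat the servers dump into the contained hot aisle, so the required flow follows from Q = ṁ · c_p · ΔT. A rough back-of-the-envelope sketch, where the heat load and temperature rise are made-up illustrative figures, not numbers from the video:

    # Back-of-the-envelope heat balance for a water-cooled hot aisle.
    # The 250 kW load and 10 K temperature rise are illustrative assumptions.
    SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)

    def water_flow_for_load(heat_load_w, delta_t_k):
        """Mass flow of water (kg/s) needed to absorb heat_load_w with a delta_t_k rise."""
        return heat_load_w / (SPECIFIC_HEAT_WATER * delta_t_k)

    # Example: a 250 kW row of racks, with the water allowed to warm by 10 K
    # between the coils and the cooling plant.
    flow = water_flow_for_load(250_000, 10.0)
    print(f"about {flow:.1f} kg/s of water, roughly {flow:.1f} litres per second")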
The thing that amazes me about Google and the data centres is the pace of innovation and how we are always challenging the way we’re doing things. So, when people say that innovation in a certain area is over, that we’ve reached the pinnacle of what can be achieved, I just laugh.

[MUSIC PLAYING]

Author: Kevin Mason

100 thoughts on “Inside a Google data center”

  1. Immersion cooling for the Google data centres? With M.2 NVMe RAID and all-optical transmission. The Google data centres need to upgrade to the current generation.

  2. I have an idea for a new method of cooling data center servers. It takes more space, but it reduces electricity, heat and noise. I would sell it for 1 billion (a milliard) €.
    So ask yourself, is it worth it:
    – takes more space
    + less electricity
    + less heat
    + less noise
    Think about that; I would say it is worth it.

  3. Google,
    People have a trick for getting unlimited Google Drive storage: they make multiple accounts, create a folder in each and share it with one primary account, so each shared folder basically adds 15 GB of storage.

  4. How do we improve this multi-billion-dollar miracle of engineering?

    Let’s try to make it only give us answers we agree with.

    So that the people we give money to will get elected.

  5. How can Google get all this data?
    Is the latest data in there, or only historical records? I appreciate that Google Search surfaces the latest records and limits how often the old records appear. Search is very fast, but oftentimes it is not accurate or gives the wrong point of view (which would be right if that were the right point of view). The search speed is definitely higher than my hard disk’s search speed.

  6. I’d like to go in and remove the program that makes pictures of black or mixed couples appear when I enter “white couples” into the search bar. That’s a… different result.

  7. Very nice, holy moly, lol. Freon cooling is maybe better. Hard disks, no SD cards? Oops. I have had my computer rebuilt. Now everything is 4K, except my webcam. The monitor is the Samsung UHD 750. Very cool. New processor and cool fans. I need better. Cost: €650 total.

  8. So, on a different note, I like how Amazon is putting out commercials about how “happy” employees are at their distribution centers when they’re not happy at all.

  9. 4:30 I really wish they used the hot water to heat something else rather than wasting the free energy and then pumping more energy into cooling it. Still a very impressive centre, though…

  10. I love this kind of video. It shows what happens behind the scenes that a lot of people don’t know about and take for granted. Things like this are happening all around the world, yet you still see people complaining about any minor inconvenience.
