Cloud Portfolio Updates with Hitachi Content Platform (HCP) and Hitachi Data Ingestor (HDI)

Hitachi Data Systems (HDS) has introduced an updated version of Hitachi Content Platform (HCP) and a new product called Hitachi Data Ingestor (HDI). Last week, I had a chance to sit in on a briefing about these two new offerings. They hit the wire today; here is HDS’s press release for the news.

Here are the key points of HCP 4 and HDI:

  • This is targeted at large enterprises for private cloud scenarios
  • This is targeted at cloud providers/co-location providers that don’t write their own APIs
  • This integrates with existing HDS storage products, including the newly announced Virtual Storage Platform (VSP)
  • Up to 40 PB of usable capacity in a single physical cluster (VSP supports up to 255 PB, however)
  • Technical features (Copied from pre-release materials):
    • Multitenant
    • Intelligent objects
    • Chargeback
    • Compression and deduplication
    • Encryption of data at rest
    • Authentication
    • WORM
    • Compliance and retention
    • Built-in protection, preservation and replication
    • High Availability architecture
    • Continuous data integrity checking
    • Advanced replication and disaster recovery

This figure captures HCP well in one quick picture:

[Figure: HCP at a glance]
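Beyond the figure, it is worth remembering that HCP is, at its core, an object store reached over HTTP (the file-style access comes from HDI, covered below). As a rough illustration only, here is a minimal Python sketch of writing and reading an object; the namespace hostname, paths, and the omitted authentication are hypothetical placeholders, so consult the HCP documentation for the real endpoint and auth scheme.

    import urllib.request

    # Hypothetical namespace endpoint -- a real HCP namespace has its own
    # tenant/namespace-specific hostname and authentication requirements.
    BASE_URL = "https://namespace.tenant.hcp.example.com/rest"

    def put_object(path: str, data: bytes) -> int:
        """Store an object at the given path in the namespace (illustrative only)."""
        req = urllib.request.Request(f"{BASE_URL}/{path}", data=data, method="PUT")
        req.add_header("Content-Type", "application/octet-stream")
        # Authentication header omitted on purpose; HCP's scheme is version-specific.
        with urllib.request.urlopen(req) as resp:
            return resp.status

    def get_object(path: str) -> bytes:
        """Read an object back from the namespace (illustrative only)."""
        req = urllib.request.Request(f"{BASE_URL}/{path}", method="GET")
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    if __name__ == "__main__":
        put_object("reports/2010/q3.pdf", b"example payload")
        print(len(get_object("reports/2010/q3.pdf")), "bytes read back")

Either way, the point of the feature list above is that a provider gets multitenancy, WORM, retention and replication from the platform rather than having to build those services themselves.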

While HCP is just one technology in the ever-changing cloud landscape, what really caught my eye was the new HDI component. Basically, HDI is a cluster (initially, and somewhat oddly, two Dell servers) that provides CIFS and NFS endpoints to an HCP engine. It is entirely multi-tenant aware as well. The figure below shows a few features and how HDI fits into the HCP environment:

[Figure: HDI features and how HDI fits into the HCP environment]

The small cluster of general-purpose servers that makes up HDI also functions as a cache: 4 TB of disk space in the HDI component provides increased performance on the front end of HCP. HDI has other features as well, such as replicate-on-write and a network of stub pointers back to HCP.
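To make the cache and stub idea concrete, here is a small conceptual sketch of the pattern as I understand it: a write lands in the local HDI cache and is replicated to HCP, and once the cached copy is evicted only a lightweight stub remains, which recalls the content from HCP on the next read. This is my own illustration of the general technique, not HDS’s implementation; the class names and object-ID scheme are hypothetical.

    class HcpStore:
        """Stand-in for the HCP back end (hypothetical; a real deployment
        would replicate over the network to the HCP cluster)."""
        def __init__(self):
            self._objects = {}

        def put(self, object_id: str, data: bytes) -> None:
            self._objects[object_id] = data

        def get(self, object_id: str) -> bytes:
            return self._objects[object_id]

    class StubFile:
        """A lightweight pointer left on the HDI cache tier once the full
        content has been replicated to HCP."""
        def __init__(self, object_id: str, store: HcpStore):
            self.object_id = object_id
            self._store = store
            self._cached = None  # locally cached copy, if still resident

        def write(self, data: bytes) -> None:
            # Replicate-on-write: keep a cached copy and send it to HCP.
            self._cached = data
            self._store.put(self.object_id, data)

        def evict(self) -> None:
            # Free cache space; only the stub pointer remains locally.
            self._cached = None

        def read(self) -> bytes:
            # Serve from cache when possible, otherwise recall from HCP.
            if self._cached is None:
                self._cached = self._store.get(self.object_id)
            return self._cached

    if __name__ == "__main__":
        hcp = HcpStore()
        f = StubFile("tenant-a/share1/report.docx", hcp)
        f.write(b"quarterly numbers")
        f.evict()        # cache pressure on the 4 TB tier
        print(f.read())  # transparently recalled from HCP

The nice part of this pattern is that the client only ever sees a normal CIFS or NFS file; whether the bytes come from the 4 TB cache or from HCP behind it is invisible.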

I’m working to get a screenshot of the CIFS/NFS engines for HDI, as I believe that if the interface is intuitive and seamless, the potential could be huge. More info is available at the HCP homepage.
