
BREACH DETECTION SYSTEMS COMPARATIVE ANALYSIS
Security Value Map (SVM)

Thomas Skybakmoen, Jason Pappalexis

Tested Products

AhnLab MDS
Fidelis XPS™ Direct 1000
FireEye Web MPS 4310 and Email MPS 5300
Fortinet FortiSandbox 3000D
Sourcefire (Cisco) Advanced Malware Protection ¹
Trend Micro Deep Discovery Inspector Model 1000

Environment

Breach Detection Systems: Test Methodology 1.5

 

 

¹ Sourcefire is now part of Cisco.

NSS  Labs  

Breach  Detection  Systems  Comparative  Analysis  –  SVM  

 

Overview

Empirical data from the individual Product Analysis Reports (PAR) and Comparative Analysis Reports (CAR) is used to create the unique Security Value Map (SVM). The SVM illustrates the relative value of security investment options by mapping the security effectiveness and value (TCO per Protected-Mbps) of tested product configurations.

The SVM provides an aggregated view of the detailed findings from NSS Labs' group tests. Individual PARs are available for every product tested. CARs provide detailed comparisons across all tested products in the areas of:

• Security
• Performance
• Total cost of ownership (TCO)

[Figure 1 chart: Security Effectiveness (y-axis, 90%–100%) plotted against TCO per Protected-Mbps (x-axis, $500 down to $0), with average lines on both axes. Fortinet, Trend Micro, Sourcefire (Cisco), and Fidelis plot above both averages; AhnLab and FireEye plot below. Products shown: AhnLab MDS; FireEye Web MPS 4310 and Email MPS 5300; Sourcefire (Cisco) Advanced Malware Protection; Fidelis XPS™ Direct 1000; Fortinet FortiSandbox 3000D; Trend Micro Deep Discovery Inspector Model 1000.]
Figure 1 – NSS Labs' Security Value Map™ (SVM) for Breach Detection Systems (BDS)

 


 

 


 

Key Findings

• Overall security effectiveness ranged from 94.5% to 99.1%, with 4 of the 6 tested products achieving greater than 95%.
• TCO per Protected-Mbps ranged from USD $232 to USD $469, with most tested devices below USD $310 per Protected-Mbps.
• NSS-Tested Throughput ranged from 667 Mbps to 1,000 Mbps, with 5 of the 6 tested products achieving 1,000 Mbps.
• The average Security Effectiveness rating was 97.5%; 4 devices were rated above average for security effectiveness, and 2 were below average.
• The average Value (TCO per Protected-Mbps) was $330.79; 4 devices were rated above average for value, and 2 were below average.
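The two averages above can be checked directly against the per-product results in Figure 2. The snippet below is an illustrative sketch (not NSS Labs code) that recomputes them as simple arithmetic means, which reproduces the published figures:

```python
# Per-product results from Figure 2 of this report:
# (security effectiveness %, TCO per Protected-Mbps in USD).
results = {
    "AhnLab MDS": (94.7, 468.78),
    "Fidelis XPS Direct 1000": (98.4, 306.89),
    "FireEye Web MPS 4310 and Email MPS 5300": (94.5, 427.85),
    "Fortinet FortiSandbox 3000D": (99.0, 309.24),
    "Sourcefire Advanced Malware Protection": (99.0, 231.86),
    "Trend Micro Deep Discovery Inspector Model 1000": (99.1, 240.11),
}

avg_effectiveness = sum(e for e, _ in results.values()) / len(results)
avg_value = sum(v for _, v in results.values()) / len(results)

print(f"{avg_effectiveness:.2f}%")  # approx. 97.45%, which the report rounds to 97.5%
print(f"${avg_value:.2f}")          # $330.79, matching the report
```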

Product Rating

The Overall Rating in the following table is determined by which SVM quadrant the product falls within: Recommended (top right), Neutral (top left or bottom right), or Caution (bottom left). For more information on how the SVM is constructed, please see the "How to Read the SVM" section of this document.

Product                                            Security Effectiveness    Value (TCO per Protected-Mbps)    Overall Rating
AhnLab MDS                                         94.7% (Below Average)     $468.78 (Below Average)           Caution
Fidelis XPS™ Direct 1000                           98.4% (Above Average)     $306.89 (Above Average)           Recommended
FireEye Web MPS 4310 and Email MPS 5300            94.5% (Below Average)     $427.85 (Below Average)           Caution
Fortinet FortiSandbox 3000D                        99.0% (Above Average)     $309.24 (Above Average)           Recommended
Sourcefire Advanced Malware Protection             99.0% (Above Average)     $231.86 (Above Average)           Recommended
Trend Micro Deep Discovery Inspector Model 1000    99.1% (Above Average)     $240.11 (Above Average)           Recommended
Figure 2 – NSS Labs' Recommendations for Breach Detection Systems (BDS)

This report is part of a series of Comparative Analysis Reports (CAR) covering security, performance, total cost of ownership (TCO), and the Security Value Map (SVM). In addition, an SVM Toolkit is available to NSS clients that allows for the incorporation of organization-specific costs and requirements to create a completely customized SVM. For more information, please visit www.nsslabs.com.

 

 

 


 

 


 

Table of Contents

Tested Products
Environment
Overview
  Key Findings
  Product Rating
How to Read the SVM
Analysis
  Recommended
    Fidelis XPS™ Direct 1000
    Fortinet FortiSandbox 3000D
    Sourcefire Advanced Malware Protection
    Trend Micro Deep Discovery Inspector Model 1000
  Neutral
  Caution
    AhnLab MDS
    FireEye MPS (FireEye Web MPS 4310 and Email MPS 5300)
Test Methodology
Contact Information

Table of Figures

Figure 1 – NSS Labs' Security Value Map™ (SVM) for Breach Detection Systems (BDS)
Figure 2 – NSS Labs' Recommendations for Breach Detection Systems (BDS)
Figure 3 – Example SVM

 

 


 

 


 

How to Read the SVM

The SVM depicts the value of a typical deployment of four (4) devices plus a central management unit (and, where necessary, a log aggregation and/or event management unit) to provide a more accurate reflection of cost than that of a single BDS device. An example SVM is shown in Figure 3.

[Figure 3 chart: an example SVM with Security Effectiveness (y-axis, 75%–100%) plotted against TCO per Protected-Mbps (x-axis, $60 down to $0), with average lines on both axes and hypothetical Vendors A–F plotted in the quadrants.]

Figure 3 – Example SVM

The x-axis charts TCO per Protected-Mbps, a metric that combines the 3-Year TCO with measured throughput to provide a single result that can be used to compare the actual value of each product tested. The terms TCO per Protected-Mbps and Value are used interchangeably throughout these reports.

The y-axis charts security effectiveness as measured by the security effectiveness tests. Devices that are missing critical security capabilities will receive a reduced score on this axis.
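As a concrete sketch of the x-axis metric (illustrative only; the cost and throughput figures below are hypothetical and not taken from this test):

```python
def tco_per_protected_mbps(three_year_tco_usd: float,
                           nss_tested_throughput_mbps: float) -> float:
    """Collapse total cost and measured throughput into one comparable number:
    dollars of 3-Year TCO per Mbps of NSS-Tested (protected) throughput."""
    return three_year_tco_usd / nss_tested_throughput_mbps

# A hypothetical device with a $300,000 3-Year TCO, tested at 1,000 Mbps,
# would plot at $300 per Protected-Mbps on the SVM's x-axis.
print(tco_per_protected_mbps(300_000, 1_000))  # 300.0
```

Note that at a fixed cost, lower measured throughput yields a higher (worse) TCO per Protected-Mbps, which is how the metric penalizes devices that under-deliver on throughput.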

 

 

 


 

 


Mapping the data points against Security Effectiveness and TCO per Protected-Mbps results in four quadrants on the SVM:

• Farther up and to the right is recommended by NSS. The upper-right quadrant contains the products in the Recommended category for both security effectiveness and TCO per Protected-Mbps. These products provide a very high level of detection and value for money.
• Farther down and to the left should be used with caution. The lower-left quadrant comprises the NSS Caution category; these products offer limited value for money given the 3-Year TCO and measured security effectiveness rating.
• The remaining two quadrants comprise the NSS Neutral category. These products may still be worthy of a place on an organization's short list based on its specific requirements.

For example, products in the upper-left quadrant score above average for security effectiveness but below average for value (TCO per Protected-Mbps). These products are suitable for environments requiring a high level of detection, albeit at a higher-than-average cost.

Conversely, products in the lower-right quadrant score below average for security effectiveness but above average for value (TCO per Protected-Mbps). These products are suitable for environments where budget is paramount, and a slightly lower level of detection is acceptable in exchange for a lower cost of ownership.

In all cases, the SVM should be only a starting point. NSS clients have access to the SVM Toolkit, which allows for the incorporation of organization-specific costs and requirements to create a completely customized SVM. Furthermore, clients have the option to schedule an inquiry call (or a written response) with the analysts involved in the actual testing and report production.
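The quadrant logic above reduces to a pair of comparisons against the per-test averages (97.5% effectiveness and $330.79 per Protected-Mbps in this test). The sketch below is an illustrative reading of the SVM, not NSS Labs code; note that the SVM's x-axis runs right to left, so "up and to the right" means above-average effectiveness and below-average cost:

```python
def svm_quadrant(effectiveness: float, tco_per_mbps: float,
                 avg_effectiveness: float = 97.5,
                 avg_tco_per_mbps: float = 330.79) -> str:
    """Map one product's results to its SVM quadrant rating.
    Default averages are this test's published values."""
    good_security = effectiveness >= avg_effectiveness
    good_value = tco_per_mbps <= avg_tco_per_mbps  # cheaper than average is better
    if good_security and good_value:
        return "Recommended"   # upper right: strong on both axes
    if not good_security and not good_value:
        return "Caution"       # lower left: weak on both axes
    return "Neutral"           # upper left or lower right: strong on one axis

print(svm_quadrant(99.1, 240.11))  # Recommended (e.g. the Trend Micro result)
print(svm_quadrant(94.5, 427.85))  # Caution (e.g. the FireEye result)
```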

 

 

 


 

 


 

Analysis

The analysis is divided into three categories based on the position of each product in the SVM: Recommended, Neutral, and Caution. Each tested product falls into only one category, and vendors are listed alphabetically within each section.

Recommended

Fidelis XPS™ Direct 1000

Key Findings:

• Fidelis XPS™ was tested and rated by NSS at 1,000 Mbps, which is in line with the vendor's claim (Fidelis rates this device at 1,000 Mbps).
• Fidelis XPS™ detected 98% of HTTP malware, 98% of email malware, and 100% of exploits, giving an overall breach detection rating of 98.4%. The device passed all six stability and reliability tests.
• Fidelis XPS™ had a 0% false positive rate as tested.

Fortinet FortiSandbox 3000D

Key Findings:

• The Fortinet FortiSandbox 3000D was tested and rated by NSS at 1,000 Mbps, which is in line with the vendor's claim (Fortinet rates this device at 1,000 Mbps).
• The Fortinet FortiSandbox 3000D detected 99% of HTTP malware, 98% of email malware, and 100% of exploits, giving an overall breach detection rating of 99.0%. The device passed all stability and reliability tests.
• The FortiSandbox 3000D had a 0% false positive rate as tested.

Sourcefire Advanced Malware Protection

Key Findings:

• Sourcefire Advanced Malware Protection was tested and rated by NSS at 1,000 Mbps, which is in line with the vendor's claim (Sourcefire rates this device at 1,000 Mbps).
• Sourcefire Advanced Malware Protection detected 99% of HTTP malware, 98% of email malware, and 100% of exploits, giving an overall breach detection rating of 99.0%. The device passed all stability and reliability tests.
• Sourcefire Advanced Malware Protection had a 0% false positive rate as tested.
• The Sourcefire solution under test included both a sensor appliance and agents.

 


 

 


Trend Micro Deep Discovery Inspector Model 1000

Key Findings:

• The Trend Micro Deep Discovery Inspector Model 1000 was tested and rated by NSS at 1,000 Mbps, which is in line with the vendor's claim (Trend Micro rates this device at 1,000 Mbps).
• The Trend Micro Deep Discovery Inspector Model 1000 detected 97% of HTTP malware, 100% of email malware, and 100% of exploits, giving an overall breach detection rating of 99.1%. The device passed all stability and reliability tests.
• The Deep Discovery Inspector Model 1000 had a 0% false positive rate as tested.

Neutral

No vendors were rated as Neutral in this test.

Caution

AhnLab MDS

Key Findings:

• AhnLab MDS was tested and rated by NSS at 1,000 Mbps, which is in line with the vendor's claim (AhnLab rates this device at 1,000 Mbps).
• AhnLab MDS detected 100% of HTTP malware, 94% of email malware, and 90% of exploits, giving an overall breach detection rating of 94.7%. The device passed all stability and reliability tests.
• AhnLab MDS misidentified 7% of legitimate traffic as malicious (false positives).

FireEye MPS (FireEye Web MPS 4310 and Email MPS 5300)

Key Findings:

• The FireEye MPS was tested and rated by NSS at 667 Mbps, which is higher than the vendor's claim (FireEye rates these devices at 250 Mbps).
• The FireEye MPS detected 95% of HTTP malware, 96% of email malware, and 93% of exploits, giving an overall breach detection rating of 94.5%. The MPS passed all stability and reliability tests.
• The FireEye MPS had a 0% false positive rate as tested.

 


 

 


 

Test Methodology

Breach Detection Systems: Test Methodology 1.5

A copy of the test methodology is available on the NSS Labs website at www.nsslabs.com.

Contact Information

NSS Labs, Inc.
206 Wild Basin Rd
Building A, Suite 200
Austin, TX 78746
+1 (512) 961-5300
[email protected]
www.nsslabs.com

This and other related documents are available at www.nsslabs.com. To receive a licensed copy or report misuse, please contact NSS Labs at +1 (512) 961-5300 or [email protected].

© 2014 NSS Labs, Inc. All rights reserved. No part of this publication may be reproduced, photocopied, stored on a retrieval system, or transmitted without the express written consent of the authors.

Please note that access to or use of this report is conditioned on the following:

1. The information in this report is subject to change by NSS Labs without notice.
2. The information in this report is believed by NSS Labs to be accurate and reliable at the time of publication, but is not guaranteed. All use of and reliance on this report are at the reader's sole risk. NSS Labs is not liable or responsible for any damages, losses, or expenses arising from any error or omission in this report.
3. NO WARRANTIES, EXPRESS OR IMPLIED, ARE GIVEN BY NSS LABS. ALL IMPLIED WARRANTIES, INCLUDING IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT ARE DISCLAIMED AND EXCLUDED BY NSS LABS. IN NO EVENT SHALL NSS LABS BE LIABLE FOR ANY CONSEQUENTIAL, INCIDENTAL, OR INDIRECT DAMAGES, OR FOR ANY LOSS OF PROFIT, REVENUE, DATA, COMPUTER PROGRAMS, OR OTHER ASSETS, EVEN IF ADVISED OF THE POSSIBILITY THEREOF.
4. This report does not constitute an endorsement, recommendation, or guarantee of any of the products (hardware or software) tested or of the hardware and software used in testing the products. The testing does not guarantee that there are no errors or defects in the products, that the products will meet the reader's expectations, requirements, needs, or specifications, or that they will operate without interruption.
5. This report does not imply any endorsement, sponsorship, affiliation, or verification by or with any organizations mentioned in this report.
6. All trademarks, service marks, and trade names used in this report are the trademarks, service marks, and trade names of their respective owners.

 

 
