... maybe not.
While companies continue their relentless campaign of 'moving to the cloud', could they be overlooking their on-premise software position? (Similarly, if you're not tracking and aligning your cloud consumption accurately you might be overpaying just the same, but let's look into on-premise here.) At a recent (2022) webinar broadcast, a former Oracle License Management Services (LMS) manager, Adi Ahuja, said that Oracle's audit has become "a sales enablement tool." Although Oracle states that their LMS "operates independently from any ongoing commercial discussions. Our services are delivered by a global team of highly experienced and knowledgeable consultants who collectively offer unrivaled knowledge on all aspects of Oracle's licensing policy.", in practice there was a close relationship between sales and licence audits, Ahuja said. No surprise to anyone who has been subject to such an audit. But let's not single out Oracle - all of the majors undoubtedly coordinate an audit internally, bringing in the product team, finance, sales, and of course the account team.

OK... we get that, but how does the software baseline assist us in establishing what's really going on? Take the often-cited audit line of 'we found a few things, but you'll just need to top up those products'. Easy enough - you buy the products in the renewal at your entitled price and all is good. What you're potentially not seeing, though, is the compliance cost from those findings that has been built into your renewal fees. Compliance cost? The renewal fees look fine - what are we referring to? Simply, you might have got a better price overall had you been able to break down where those costs came from, and that means having a costed baseline (ie. at line-item level) that you can apply all of the adjustors to (inflation, price increases etc.) and determine whether any 'additional' costs have covertly come into play - aka, a compliance cost.
Only then can you challenge the vendor's assertion that 'you'll just need to top up those products', because the baseline will tell you how much backdating has been applied, whether the top-up was in fact at entitled price, and ultimately whether the overall renewal fee has been indexed reasonably at all. Consider the room your vendor has to move when you're faced with a multi-million dollar renewal - there are numerous places to 'hide' revenue pulls, and that doesn't change at lower levels, it just scales down. Establishing and maintaining a baseline is something companies can flinch at - they see it as just not worth the effort - by default delegating this to their vendors, aka granting them free rein to manipulate pricing as they see fit. So while it might take a concerted project (or how about an actual SAM practice!) to get going, once established - and maintained in a purpose-built system such as ComplianceWare - the overheads are much reduced and the benefits more easily returned. Further, it sends a convincing message to your vendors that you actively manage and are across your software landscape and commercial position, which makes them much more wary of any attempt to hoodwink you with a 'great renewal offer that puts any compliance issues to bed'!
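The baseline check described above can be sketched in a few lines. This is a hypothetical illustration only - the products, fees, and 5% indexation rate are invented for the example, and a real baseline would carry far more adjustors per line item.

```python
# Hypothetical sketch: surface a hidden "compliance cost" by re-pricing a
# renewal from a costed, line-item baseline. All names/figures are invented.

def expected_renewal(baseline_items, inflation_rate):
    """Sum each line item's entitled fee, indexed by the agreed adjustor."""
    return sum(item["entitled_fee"] * (1 + inflation_rate) for item in baseline_items)

baseline = [
    {"product": "DB Enterprise", "entitled_fee": 100_000},
    {"product": "App Server",    "entitled_fee": 40_000},
]

quoted = 155_000                              # vendor's renewal quote
expected = expected_renewal(baseline, 0.05)   # 147,000 at 5% indexation
hidden_cost = quoted - expected               # anything above zero warrants a challenge
```

Anything the quote carries above the indexed baseline is exactly the 'additional' cost the post describes - and without the line-item baseline, it's invisible.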
The February 2021 edition of Microsoft's Product Terms document will be the last. A little under two years ago we reviewed Microsoft's new approach to licensing terms in our June 2019 blog here - now it's being further revamped. As announced on the front page of the February PT document: "Please note this is the last Product Terms Word document. Going forward, the terms will be published on the Product Terms site available at https://www.microsoft.com/licensing/terms/productoffering. Archived versions will continue to be available. For more details, go to https://www.microsoft.com/Licensing/product-licensing/products." What does it look like? The landing page is shown below. Quite clear and compact, although you will need to be quite savvy with their license programs and models to get the most out of using it.

... and when put to the test? We decided to take on one of their more convoluted product licensing models - Power BI - and, well, it didn't seem any simpler. With prerequisites like "Power Automate per user with attended RPA plan, or Power Automate per flow plan" (ok...), and Extended Use Rights such as "Power Apps Portals that map to licensed Dynamics 365 application context and, Power Apps Portals that map to the same environment as the licensed Dynamics 365 application" (right...), the format might have changed but the content is still not that intuitive, is it? So while access to dynamic and current licensing information is always a good thing, simpler licensing models and metrics would, we think, resonate much better with software customers in general. After all, we all want to be compliant, so why make it so hard, we wonder - any thoughts / comments?
Could the Change to IBM's PVU Core Table Signal a Refreshing Shift in Sub-Capacity Licensing? While some vendors prefer to wallow in the mire of antiquated and irrelevant licensing regimes, others seem to be moving ahead with revised models that provide clarity and ease in establishing your licensing and compliance position. A case in point - IBM - who flagged a rethink with a shift from the messy PVU to Virtual Processor Core metrics (example in the hyperlink). Starting April this year the x86 PVU table has been culled down to just 6 entries, with the Intel category now much simplified for the Xeon chipset - basically all determined by the number of sockets at 2, 4, and >4 (with the lower models in the listed ranges remaining at 50 PVUs). There is however one complication - Symmetric Multiprocessing (SMP) Servers - which you need to factor per the definition below: "The PVU requirement for the Intel processor technology indicated is dependent on the maximum number of sockets on the server. If sockets on two or more servers are connected to form a Symmetric Multiprocessing (SMP) Server, the maximum number of sockets per server increases." Example:
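A socket-driven table lends itself to a trivially simple lookup, which is the whole point of the change. The sketch below is illustrative only - the PVU-per-core values are placeholders, not IBM's published figures, so always confirm against the current PVU table (and remember that under the SMP definition above, connected servers count their combined sockets).

```python
# Illustrative sketch of a socket-based PVU lookup for the simplified x86
# table. The ratings here are PLACEHOLDER values, not IBM's actual table --
# look up your processor in IBM's current published PVU table.

def pvu_per_core(sockets):
    """Assumed PVU-per-core rating keyed purely off the server's socket count.
    For SMP configurations, pass the combined socket count of the connected servers."""
    if sockets <= 2:
        return 70
    if sockets <= 4:
        return 100
    return 120

# Example: an 8-core, 2-socket server
licensable_pvus = 8 * pvu_per_core(2)   # 8 cores x 70 = 560 PVUs
```

Contrast that with the old table, where the rating depended on vendor, brand, model number range, and more - a socket count is something anyone can verify in seconds.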
Good news from our perspective - anything that removes ambiguity is welcomed (with reference to the linked post at the start of this blog: "oh but you have to count the Physical cores, not virtual, on the Host, in fact all Hosts in the complex, actually in the Data Center, well let's say the Cloud then, so basically ... ... everything, everywhere")
As the chatter of audits increases around the industry, the range of reactions can run from outright fear to mild anxiety, but... sometimes - enthusiasm! What, I hear you say - enthusiasm?? Well yes, surprisingly. For those organisations that run a well-informed and skilled software / licensing function, an audit offers the prospect of evaluating just how effective their investment in processes and tools has been, and of making any adjustments as/if necessary. Similarly, it provides an opportunity for objective feedback to management in a discipline that is otherwise difficult to gauge - just think - how can you quantify ROI without having a relative measure to report against? The contrary position - where organisations have no certainty at all of their compliance state - is not a great place to be and certainly does warrant some anxiety. Not only is there the likelihood of remediation (at $$?? cost), but when you don't have a position, what can you actually contest? There's no doubt that the 'arms-length' engagement of external auditors allows just that much more vendor independence to put more onus on you, the customer - the audit will deliver a straight deployment report, leaving it to you to clarify what might be chargeable and what might not. Examples: development software that might be free, supporting products under one suite that might be dispersed across servers, or even bundles - permitted, but unless qualified by you will still be listed as chargeable installs. So it's worth considering just where you are on the compliance scale. Ask yourself these three key questions:
If you do - great! Check that the processes are running as expected and you can take any impending audit on without that gut-wrenching fear and anxiety. If you're lacking on any front, though, some attention is warranted. Start with ownership - who will be responsible and held accountable for your software assets? Then, how will you keep a current and complete record across it all? You'll no doubt arrive at the conclusion that you'll need tools to do it all efficiently and effectively, so the questions become: which tool is right for you? What features and functions do I really need? What price do I then want to pay for it? The very questions that led us to develop ComplianceWare - our full-featured, cloud-based product designed to meet the needs of organisations who don't want those top-end, highly integration-reliant, distended suites offered by some of the more well-known global providers. ComplianceWare offers just the essential functions in an easy-to-use web-based application: software discovery and deployment reporting, customisation via configurations and conventions, and of course a contracts and license repository. And by delivering just the essentials we can offer a price to match - that is, the most cost-effective solution you will find in the market. Try it as a one-off managed service (perhaps even using that audit data you've just been asked to provide) and evaluate it on your own estate, or as a term license with access when and as you need it. Take a look at the Documentation or request a Demonstration to find out more - and we're always here to help you out as needed!
It happens all too often - the assumption that a 'standard' licensing metric carries its 'standard' definition - only to find out (generally through an audit) that no, that's not the case. A classic example - the FTE metric. Simple, right? Full Time Equivalent, standard 38 hours per week, calculated across the hours worked for the period by the organisation. Well, as it turns out from some recent examples, perhaps not. Consider firstly, what constitutes the organisation? Does it extend to other entities in your company structure, ie. affiliates, joint ventures, perhaps franchisees? How that is interpreted could have a massive impact on the size of the FTE pool that needs to be measured. And if there's an automatic inclusion clause in the contract, then as soon as an acquisition occurs you're liable, even though the acquired entity will probably neither be intending to use the system nor be operationally integrated for some time (remember that innocuous little checkbox in your Microsoft Enterprise Agreement?). And then, what actually comprises 'hours worked'? Surely that's straightforward? Well, as it turns out - no. It might just be that your vendor considers it to be all hours recorded in your system - annual leave, sick leave, even maternity and long-service leave. Or how about time spent volunteering, ie. time not even associated with your business - yes, that can be included too, to the extent that we have seen external influences captured under the definition as well, such as records related to Centrelink payments or involvement with the Defence Forces. So don't just assume that basic metric is what you expect it to be. Go through the contract definitions - and if it's not there, make sure it gets added, and as always it needs to be clear, complete, and unambiguous. If unsure, check some scenarios with your vendor and, if warranted, have those included in the documentation as well. It can save you a lot of angst at a later time.
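To see how much the definition matters, here's a small hypothetical: the same recorded hours produce materially different FTE counts depending on which categories the contract sweeps in. The categories and figures below are invented for illustration.

```python
# Hypothetical sketch: the FTE count swings with the contract's definition
# of 'hours worked'. Categories and hour totals are illustrative only.

STANDARD_WEEK = 38  # the assumed full-time hours per week

def fte_count(weekly_hours_by_category, included_categories):
    """FTEs = included recorded hours for the week / standard full-time week."""
    hours = sum(weekly_hours_by_category[c] for c in included_categories)
    return hours / STANDARD_WEEK

recorded = {
    "worked":       3800,   # hours actually worked
    "annual_leave":  380,
    "sick_leave":    190,
    "volunteering":   76,   # not even associated with the business!
}

narrow = fte_count(recorded, ["worked"])        # 100.0 FTE
broad  = fte_count(recorded, recorded.keys())   # 117.0 FTE under the widest definition
```

A 17% difference in the licensable pool, from exactly the same payroll data - which is why the definition needs to be nailed down in the contract, not assumed.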
How might it affect performance and licensing? As the full implications emerge with the design of Intel's (and potentially others') x64 processors (see more at the Register here), we await with interest a response from software vendors as to how the corresponding issue of licensing will be answered and resolved. Given patches are now being released (eg. AWS EC2 5th January, Azure the 10th January), the resultant performance impacts will become the subject of intense scrutiny. Why? Well, if, as reported, processing power diminishes anywhere from 5 to 30 percent, how will customers be compensated? Processor- and core-based software has been dutifully acquired on the basis of the underlying performance of the chipsets on which the products are run (consider IBM's PVUs, Microsoft's core minimums etc). Now though, if that proves to be erroneous, surely a remedy must be made available to the customer who has paid for a defined - and benchmarked - level of processing power? Take the scenario whereby a customer's current 2,000 PVUs can no longer deliver the required throughput and a further 500 PVUs are needed in order to deliver the same capability - you would be right to expect no additional charges to apply given there is no improvement in performance, surely? And what about needing more hardware just to meet the current level of demand, or a Cloud vendor purchasing an array of new servers in order to provision more vCPUs for their PaaS / SaaS customers just to meet the same CPU cycles? That all costs money, so... who pays?? Which then presents an intriguing conundrum for the chip makers and the software vendors. Presumably there will be a vast re-benchmarking exercise (and consider that chipsets produced in the last 10 years are potentially affected), the question then being: what is to be done on the basis of the results? Compensation? Free license grants? Reduced annual maintenance fees??
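The 2,000-PVU scenario above is just a capacity ratio. As a sketch (using the post's own hypothetical figures): if per-core throughput falls by some fraction, the extra PVUs needed to restore the original capability follow directly.

```python
# Sketch of the compensation scenario in the text: how many extra PVUs are
# needed to deliver the same throughput after a performance hit? The 20%
# degradation figure is chosen to reproduce the post's 2,000 -> +500 example.

def extra_pvus_needed(current_pvus, degradation):
    """PVUs required on top of the current entitlement to restore original throughput."""
    required = current_pvus / (1 - degradation)   # scale capacity back up
    return required - current_pvus

shortfall = extra_pvus_needed(2000, 0.20)   # 500 extra PVUs at a 20% hit
```

Those 500 PVUs deliver zero net improvement over what the customer originally paid for - hence the argument that no additional charge should apply.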
So we expectantly await vendor responses once the focus on getting fixes out shifts to the underlying and associated commercial dilemma confronting the industry. What can you do in the meantime? Firstly, make sure you have current performance metrics that you can measure any degradation against, and then pose these questions to your vendor Account Manager, your Sales Rep, your Software Specialist - ask how any performance issues you experience might be remediated in the immediate term, and request an open and regular communication channel to stay informed as it all progresses ... We firmly believe - if the projected performance impacts do transpire - this issue will prove to be one of the most perplexing problems to emerge in the IT industry in many years. Watch this space for further updates as more unfolds.
While these three little snippets might not seem particularly sensational, they are worth noting precisely for that reason - they are likely lurking in the background, ready to cost you money when you least need it!

OK, so we all know that under the IBM sub-capacity rules we must produce a report from ILMT every quarter, right? And we know that we must sign and date that record, and keep them all as artefacts that may be required during any audit too, right? All good, then the tip: make sure you have configured ILMT correctly and fully for VM Management. What's so important about VM Management in ILMT? If not properly configured it will default to 120 PVUs per core, so you could be over-reporting without being aware. How can you tell if it's configured? Firstly, it shows a status on the Dashboard, and secondly, if not configured, servers will be displayed with a serial-like number beginning with 'TLM_VM' or similar. If you need more information on how to configure it, just look here.

Microsoft in many ways have led the industry in a shift to SaaS offerings backed by subscription-based licensing. While this may appear to have a favourable ROI initially, there are other time-value commercial components to consider. Firstly, you need to be aware that your licensing is now not only visible but manageable real-time by Microsoft. So from a commercial perspective there is now no locked-in pricing for the typical 3-year term of an Enterprise Agreement; instead you will see price increases built in year on year in your CPS. And more so, there is no 'True-Up' benefit whereby you would pay essentially half the cost in the year in which you deployed the product - you now 'reserve' the additional licenses you need to be drawn down, and you start paying from that month onwards. The tip? Make sure you consider TVM with subscription changes in your ROI / cost comparisons.

And the last tip for 2017... a favourite topic here at Software Compliance... processor to core conversion. Wrong!
... A license pack is applied to a server, so where you have, say, a 12-core server you need to assign 6 x 2-core packs - you can't assign 12 from your 16-core pack and then apply the other 4 elsewhere. A nasty - and potentially expensive - error if not properly considered in determining your conversion. And so ends 2017... we look forward to a busy and productive 2018 for us all!
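The per-server pack rule above boils down to rounding up per server rather than pooling cores across the estate. A minimal sketch, using the pack sizes mentioned in the post:

```python
# Sketch of the per-server pack rule: packs cannot be split across servers,
# so each server's requirement rounds up to whole packs independently.

import math

def packs_for_server(cores, pack_size):
    """Whole packs needed for ONE server -- leftover cores can't move elsewhere."""
    return math.ceil(cores / pack_size)

twelve_core_in_2packs  = packs_for_server(12, 2)    # 6 x 2-core packs
twelve_core_in_16packs = packs_for_server(12, 16)   # 1 x 16-core pack, 4 cores wasted
```

Summing raw core counts across servers and dividing by the pack size will understate your requirement whenever a pack straddles the boundary - exactly the conversion error the tip warns about.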
Gone are the more simplistic days of Microsoft Per Processor licensing, when there was a basic assignment of a single license to a processor, unlimited use and access, all available across multiple editions of software. Indeed, Microsoft were touting per processor as a major point of difference as far back as SQL Server 2008, going so far as to claim ‘thought leadership’ when it came to competitor licensing models aligned to multicore processors. From 2016 though (and noting the GA of SQL Server 2017 from October 2017), following the gradual demise of the processor metric, there are now primarily three Per Core licensing models:
So let’s take a look at two of the more common server products afflicted by this change: SQL Server under (1) and Windows Server under (2). If you are intending to use Self-Hosting or SPLA rights, note that there are further considerations not covered here - the context of this blog is confined to licensing acquired under Microsoft’s Volume Licensing offerings (refer: Microsoft Commercial Licensing).

SQL Server 2016 With SQL Server 2016 Per Core licensing, each server running the software or any of its components (such as Reporting Services or Integration Services) must be assigned an appropriate number of SQL Server 2016 core licenses. The number of core licenses needed depends on whether you are licensing the physical server or individual virtual operating system environments (OSEs), across either edition. Unlike the Server+CAL licensing model, the Per Core model allows an unlimited number of users or devices to connect from either inside or outside an organisation’s firewall - you do not need to purchase client access licenses (CALs) to access the SQL Server software. When running SQL Server in a physical OSE, all physical cores on the server must be licensed, noting software partitioning does not reduce the number of core licenses required except when licensing individual virtual machines (VMs). A minimum of four core licenses is required for each physical processor on the server, with the use of hyper-threading not affecting the number of core licenses required when running in a physical OSE - only those licensed under individual virtual machines (which are still subject to the four-core minimum). So with the basics understood, you’ll then want to familiarise yourself with what you gain with the addition of a Software Assurance subscription… Key SQL Server SA Benefits
And be cautious – the components of a SQL Server license cannot be separated. While management tools and other software identified as additional or supplemental software (such as product documentation, client connectivity tools, software add-ins, and Software Development Kits (SDKs)) can generally be distributed and run on any number of devices for use with a licensed instance of SQL Server software, other licensed components such as the SQL Server Database Engine (DB), SQL Server Services for Windows, Master Data Services (MDS), Analysis Services (AS), Integration Services (IS), Reporting Services (RS), and Data Quality Services (DQS) will require licensing if deployed to other servers. You can find more details of the components at: SQL Server Software Components. And for Non-Production: effective April 1, 2016, SQL Server Developer Edition became a free product, available for download from the Microsoft Dev Essentials program as a potential alternative to the likes of a Visual Studio subscription. SQL Server 2016 Developer Edition is a fully featured version of SQL Server software – including all of the features and capabilities of Enterprise Edition – licensed for development, test, and demonstration purposes only.

Windows Server 2016 For both Standard and Datacenter editions, the number of core licenses required equals the number of physical cores on the licensed server, subject to a minimum of 8 core licenses per physical processor and a minimum of 16 core licenses per server (sold in 2-core and 16-core packs). Which means being very careful when you tally your overall requirement – make sure you account for the 16-core per-server minimum across any single-CPU servers you might have in your inventory, where you might otherwise under-allocate (where <16). And the differences in the editions?
(and if you're looking for a definition of containers, look no further - go here) So, To Finish … Bear in mind that in both cases you will still need to account for the CAL requirements if necessary (and remember to count all direct and indirect users/devices, ie. no multiplexing), typically via the likes of a Core CAL Suite or equivalent, which as an example provides the following licenses:
Noting that the likes of SQL Server CALs and Dynamics/CRM CALs must be acquired separately. Microsoft have provided a very helpful Licensing Brief across Core Licensing that I would also recommend reading for more information.
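The per-core minimums discussed above are easy to trip over when tallying an inventory, so here's a short sketch of the arithmetic for physical-OSE licensing under both products, using the minimums stated in this post (SQL Server 2016: all physical cores, minimum 4 per processor; Windows Server 2016: minimum 8 per processor and 16 per server). Treat it as an illustration of the counting rules, not licensing advice.

```python
# Sketch of physical-OSE core-license tallies per the minimums described
# in the post. Input: a list of physical core counts, one per processor.

def sql_core_licenses(cores_per_processor):
    """SQL Server 2016: license all physical cores, min 4 per processor."""
    return sum(max(cores, 4) for cores in cores_per_processor)

def windows_core_licenses(cores_per_processor):
    """Windows Server 2016: min 8 per processor AND min 16 per server."""
    per_processor = sum(max(cores, 8) for cores in cores_per_processor)
    return max(per_processor, 16)

# A single-CPU 6-core server: SQL needs 6 licenses, but Windows Server
# needs 16 -- the per-server minimum the post warns about.
sql_needed = sql_core_licenses([6])        # 6
win_needed = windows_core_licenses([6])    # 16
```

Note how the same hardware yields different counts under each product's rules - which is why a single "core count" column in an inventory spreadsheet is never enough.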
IBM first introduced sub-capacity licensing in 2005 in response to the production of the x86 dual-core chipset, on the premise that licensing could shift to each core on a chip, not just the chipset itself. Clearly this has merit given today's much more complex, virtualised technology platforms, with chips containing up to 18 cores - the enduring problem for customers, though: understanding the sub-capacity rules and staying compliant. So let's take a look at one of the most common IBM sub-capacity metrics - the Processor Value Unit, or PVU, for middleware. Firstly, let's be clear on IBM's terminology relevant to this discussion: Core - a functional unit within a computing device that interprets and executes software instructions. Chip - electronic circuitry, containing but not limited to at least one core, on a silicon wafer. Socket - the mount that secures a chip to a motherboard. Processor - there remains disagreement in the computer industry over the definition of a processor; IBM defines a processor as the core. For example, a dual-core chip has two processor cores on it. As an IBM customer there are certain prerequisites to employing sub-capacity licensing under your Passport Advantage (PA) Agreement:
So with that all ticked off (and make sure they are - you will need these artefacts in the event of any audit), what are the basics in determining the number of licenseable PVUs?
If counting physical cores you count Activated Processor Cores, which are processor cores that are available for use in the server (ie. don't count if deactivated). The illustration below provides an example of the counting rules applied in an x86 environment where the Virtualisation Capacity equates to the number of licenseable cores: So with the number of cores established, how do we determine the applicable PVU count? IBM assigns a PVU per processor core rating to each processor technology collated in the PVU Table published on their website - and note just how much has changed when you compare to the original table from 2006 below (!) You'll need the requisite system access in order to interrogate your systems to determine the relevant processor model - refer to IBM's helpful Guide if you need more information on how to do this (or, let our ComplianceWare tool do it all for you!)
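The arithmetic once you have both numbers is simple multiplication, which is worth stating plainly because it's the core counting that causes all the trouble, not the maths. A minimal sketch (the 70 PVU/core rating is illustrative - look up your actual processor model in IBM's PVU table):

```python
# Minimal sketch of the PVU calculation: licensable PVUs = the activated
# cores available to the product (per the counting rules) x the PVU-per-core
# rating from IBM's PVU table. The rating of 70 below is illustrative only.

def pvus_required(activated_cores, pvu_rating_per_core):
    return activated_cores * pvu_rating_per_core

# Example: a product capped at 4 virtual cores on a 70 PVU/core chip
total = pvus_required(4, 70)   # 280 PVUs
```

Under sub-capacity rules the `activated_cores` figure is the lesser of the virtualisation capacity available to the product and the server's full physical capacity - which is exactly what ILMT is tracking for you.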
It's then a matter of extrapolating the PVU counts by cores across your product installations, and tracking on a regular basis to be sure you account for the inevitable changes in your environment that alter those figures (again, easily done through system compares of our ComplianceWare output). IBM has published a list of FAQs that should assist you with any other queries you might have, or... comment here and we'll be happy to help!